Documentation edits (#1996)
* Edits part 1

* Edits part 2

* Apply suggestions from code review

Co-authored-by: Kathy Pippert <[email protected]>

* Apply suggestions from code review

Co-authored-by: Kathy Pippert <[email protected]>

* Apply suggestions from code review

Co-authored-by: Kathy Pippert <[email protected]>

* Apply suggestions from code review

Co-authored-by: Kathy Pippert <[email protected]>

* Changes from code review

* Review changes

* Apply suggestions from code review

Co-authored-by: Paul Profizi <[email protected]>

---------

Co-authored-by: Kathy Pippert <[email protected]>
Co-authored-by: Paul Profizi <[email protected]>
3 people authored Jan 9, 2025
1 parent 3883355 commit a187839
Showing 18 changed files with 137 additions and 144 deletions.
doc/source/_static/simple_example.rst (2 changes: 1 addition & 1 deletion)
@@ -1,4 +1,4 @@
-Here's how you would open a result file generated by Ansys MAPDL (or another Ansys solver) and
+The following example shows how to open a result file generated by Ansys MAPDL (or another Ansys solver) and
extract results:

.. code-block:: default
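As context for this edit, a minimal sketch of the kind of example the file introduces might look like this (the result file path is a placeholder):

.. code-block:: python

    from ansys.dpf import core as dpf

    # Open the result file through a DPF model (placeholder path)
    model = dpf.Model("path/to/file.rst")

    # Evaluate nodal displacements into a fields container
    displacements = model.results.displacement().eval()
    print(displacements)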
doc/source/getting_started/compatibility.rst (10 changes: 5 additions & 5 deletions)
@@ -8,7 +8,7 @@ Operating system
----------------

DPF supports Windows 10 and Rocky Linux 8 and later.
-To run DPF on CentOS 7, use DPF for 2024R2 (8.2) or older.
+To run DPF on CentOS 7, use DPF for 2024 R2 (8.2) or later.
For more information, see `Ansys Platform Support <https://www.ansys.com/solutions/solutions-by-role/it-professionals/platform-support>`_.

Client-server
@@ -23,8 +23,8 @@ version.

As new features are developed, every attempt is made to ensure backward
compatibility from the client to the server. Backward compatibility is generally ensured for
-the 4 latest Ansys versions. For example, ``ansys-dpf-core`` module with 0.8.0 version has been
-developed for Ansys 2023 R2 pre1 release, for 2023 R2 Ansys version. It is compatible with
+the four latest Ansys versions. For example, the ``ansys-dpf-core`` module 0.8.0 has been
+developed for the Ansys 2023 R2 version. It is compatible with
2023 R2, 2023 R1, 2022 R2 and 2022 R1 Ansys versions.

Starting with version ``0.10`` of ``ansys-dpf-core``, the packages ``ansys-dpf-gate``,
@@ -34,8 +34,8 @@ and prevent synchronization issues between the PyDPF libraries, requiring to drop
previous to 2022 R2.

**Ansys strongly encourages you to use the latest packages available**, as far they are compatible
-with the Server version you want to run. Considering Ansys 2023 R1 for example, if ``ansys-dpf-core``
-module with 0.10.0 version is the latest available compatible package, it should be used.
+with the server version you want to run. Considering Ansys 2023 R1 for example, if ``ansys-dpf-core``
+module 0.10.0 is the latest available compatible package, it should be used.

For ``ansys-dpf-core<0.10``, the `ansys.grpc.dpf <https://pypi.org/project/ansys-grpc-dpf/>`_
package should also be synchronized with the server version.
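A quick sketch for checking which client and server versions are in play (``dpf.__version__`` reports the client version and ``server.version`` the DPF Server version):

.. code-block:: python

    from ansys.dpf import core as dpf

    print(dpf.__version__)  # ansys-dpf-core client version

    # Requires a local DPF Server installation
    server = dpf.start_local_server()
    print(server.version)   # DPF Server version, for example "8.2"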
doc/source/getting_started/dpf_server.rst (15 changes: 8 additions & 7 deletions)
@@ -16,7 +16,7 @@ The first standalone version of DPF Server available is 6.0 (2023 R2).

The sections on this page describe how to install and use a standalone DPF Server.

-* For a quick start on using PyDPF, see :ref:`ref_getting_started`.
+* For a brief overview on using PyDPF, see :ref:`ref_getting_started`.
* For more information on DPF and its use, see :ref:`ref_user_guide`.


@@ -65,7 +65,7 @@ PyDPF-Core is a Python client API communicating with a **DPF Server**, either
through the network using gRPC or directly in the same process. PyDPF-Post is a Python
module for postprocessing based on PyDPF-Core.

-Both PyDPF-Core and PyDPF-Post can be used with DPF Server. Installation instructions
+Both PyDPF-Core and PyDPF-Post can be used with the DPF Server. Installation instructions
for PyDPF-Core are available in the PyDPF-Core `Getting started <https://dpf.docs.pyansys.com/version/stable/getting_started/install.html>`_.
Installation instructions for PyDPF-Post are available in the PyDPF-Post `Getting started <https://post.docs.pyansys.com/version/stable/getting_started/install.html>`_.

@@ -98,10 +98,10 @@ to use thanks to its ``ansys_path`` argument.
PyDPF otherwise follows the logic below to automatically detect and choose which locally installed
version of DPF Server to run:

-- it uses the ``ANSYS_DPF_PATH`` environment variable in priority if set and targeting a valid path to a DPF Server installation.
-- it then checks the currently active Python environment for any installed standalone DPF Server, and uses the latest version available.
-- it then checks for ``AWP_ROOTXXX`` environment variables, which are set by the **Ansys installer**, and uses the latest version available.
-- if then raises an error if all of the steps above failed to return a valid path to a DPF Server installation.
+- It uses the ``ANSYS_DPF_PATH`` environment variable in priority if set and targeting a valid path to a DPF Server installation.
+- It then checks the currently active Python environment for any installed standalone DPF Server, and uses the latest version available.
+- It then checks for ``AWP_ROOTXXX`` environment variables, which are set by the **Ansys installer**, and uses the latest version available.
+- It then raises an error if all of the preceding steps failed to return a valid path to a DPF Server installation.

Run DPF Server in a Docker container
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -111,7 +111,8 @@ DPF Server can be run in a Docker container.
in :ref:`Install DPF Server <target_installing_server>`, download the ``Dockerfile`` file.
#. Optional: download any other plugin ZIP file as appropriate. For example, to access the ``composites`` plugin for Linux,
download ``ansys_dpf_composites_lin_v2025.1.pre0.zip``.
-#. Copy all the ZIP files and ``Dockerfile`` file in a folder and navigate into that folder.
+#. Copy all the ZIP files and the ``Dockerfile`` file into a folder.
+#. Navigate into the folder used in the previous step.
#. To build the DPF Docker container, run the following command:

.. code::
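A sketch of overriding this detection logic from Python, assuming a hypothetical 2024 R2 installation path:

.. code-block:: python

    import os

    from ansys.dpf import core as dpf

    # Highest priority: the ANSYS_DPF_PATH environment variable (hypothetical path)
    os.environ["ANSYS_DPF_PATH"] = r"C:\Program Files\ANSYS Inc\v242"

    # Alternatively, bypass detection by passing the path explicitly
    server = dpf.start_local_server(ansys_path=r"C:\Program Files\ANSYS Inc\v242")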
doc/source/getting_started/index.rst (2 changes: 1 addition & 1 deletion)
@@ -6,7 +6,7 @@ Getting started

The Data Processing Framework (DPF) provides numerical simulation users and engineers with a toolbox
for accessing and transforming simulation data. DPF can access data from Ansys solver
-result files as well as from several neutral (see :ref:`ref_main_index`).
+result files as well as from several neutral file formats. For more information, see :ref:`ref_main_index`.

This **workflow-based** framework allows you to perform complex preprocessing and
postprocessing operations on large amounts of simulation data.
doc/source/getting_started/install.rst (10 changes: 5 additions & 5 deletions)
@@ -16,8 +16,8 @@ with this command:
pip install ansys-dpf-core
-PyDPF-Core plotting capabilities require to have `PyVista <https://pyvista.org/>`_ installed.
-To install PyDPF-Core with its optional plotting functionalities, use:
+PyDPF-Core plotting capabilities require you to have `PyVista <https://pyvista.org/>`_ installed.
+To install PyDPF-Core with its optional plotting functionalities, run this command:

.. code::
@@ -62,10 +62,10 @@ then use the following command from within this local directory:
pip install --no-index --find-links=. ansys-dpf-core
-Beware that PyDPF-Core wheelhouses do not include the optional plotting dependencies.
-To allow for plotting capabilities, also download the wheels corresponding to your platform and Python interpreter version
+Note that PyDPF-Core wheelhouses do not include the optional plotting dependencies.
+To use the plotting capabilities, also download the wheels corresponding to your platform and Python interpreter version
for `PyVista <https://pypi.org/project/pyvista/#files>`_ and
-`matplotlib <https://pypi.org/project/matplotlib/#files>`_, then place them in the same previous local directory and run the command above.
+`matplotlib <https://pypi.org/project/matplotlib/#files>`_. Then, place them in the same local directory and run the preceding command.


Install in development mode
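One way to confirm that the optional plotting dependencies are usable after installation, sketched here assuming the packaged ``find_simple_bar`` example helper is available:

.. code-block:: python

    from ansys.dpf import core as dpf
    from ansys.dpf.core import examples

    # Plotting the mesh requires PyVista
    model = dpf.Model(examples.find_simple_bar())
    model.metadata.meshed_region.plot()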
doc/source/getting_started/licensing.rst (29 changes: 14 additions & 15 deletions)
@@ -4,11 +4,10 @@
Licensing
=========

-This section details how to properly set up licensing, as well as what the user should expect in
-terms of limitations or license usage when running PyDPF scripts.
+This section describes how to properly set up licensing, as well as limitations and license usage when running PyDPF scripts.

-DPF follows a client-server architecture, which means that the PyDPF client library must interact with a running DPF Server.
-It either starts a DPF Server via a local installation of DPF Server, or it connects to an already running local or remote DPF Server.
+DPF follows a client-server architecture, so the PyDPF client library must interact with a running DPF Server.
+It either starts a DPF Server via a local DPF Server installation, or it connects to an already running local or remote DPF Server.

DPF Server is packaged within the **Ansys installer** in Ansys 2021 R1 and later.
It is also available as a standalone application.
@@ -20,12 +19,12 @@ For more information on installing DPF Server, see :ref:`ref_dpf_server`.
License terms
-------------

-When using the DPF Server from an Ansys installation, the user has already agreed to the licensing
+When using the DPF Server from an Ansys installation, you have already agreed to the licensing
terms when installing Ansys.

-When using a standalone DPF Server, the user must accept the ``DPF Preview License Agreement``
+When using a standalone DPF Server, you must accept the ``DPF Preview License Agreement``
by following the indications below.
-Starting a DPF Server without agreeing to the ``DPF Preview License Agreement`` throws an exception.
+Starting a DPF Server without agreeing to the ``DPF Preview License Agreement`` creates an exception.

DPF Preview License Agreement
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -51,7 +50,7 @@ existing license for the edition and version of DPF Server that you intend to use.
Configure licensing
-------------------

-If your machine does not have a local Ansys installation, you need to define where DPF should look for a valid license.
+If your machine does not have a local Ansys installation, you must define where DPF should look for a valid license.

To use a local license file, set the ``ANSYSLMD_LICENSE_FILE`` environment
variable to point to an Ansys license file ``<license_file_to_use>``:
@@ -85,12 +84,12 @@ License checks and usage
------------------------

Some DPF operators require DPF to check for an existing license
-and some require DPF to check-out a compatible license increment.
+and some require DPF to checkout a compatible license increment.

-DPF is by default allowed to check-out license increments as needed.
+DPF is by default allowed to checkout license increments as needed.
To change this behavior, see :ref:`here <licensing_server_context>`.

-To know if operators require a license increment check-out to run, check their ``license``
+To know if operators require a license increment checkout to run, check their ``license``
attribute in :ref:`ref_dpf_operators_reference` or directly in Python by checking the operator's
properties for a ``license`` key:

@@ -109,13 +108,13 @@ properties for a ``license`` key:
To check which Ansys licensing increments correspond to ``any_dpf_supported_increments``,
-see :ref:`here<target_to_ansys_license_increments_list>`.
+see :ref:`Compatible Ansys license increments<target_to_ansys_license_increments_list>`.

-Even if an operator does not require a license check-out to run, most DPF operators still require
+Even if an operator does not require a license checkout to run, most DPF operators still require
DPF to check for a reachable license server or license file.

-Operators which do not perform any kind of license check are source operators (data extraction
-operators) which do not perform any data transformation.
+Operators that do not perform any kind of license check are source operators (data extraction
+operators). These operators do not perform any data transformation.

For example, when considering result operators, they perform data transformation if the requested
location is not the native result location. In that case, averaging occurs which is considered
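A sketch combining both steps in Python (the license server name is a placeholder, and the ``specification.properties`` lookup assumes the operator exposes the ``license`` key described above):

.. code-block:: python

    import os

    from ansys.dpf import core as dpf

    # Point DPF at a license server before any server starts (placeholder name)
    os.environ["ANSYSLMD_LICENSE_FILE"] = "1055@my_license_server"

    # Check whether an operator requires a license increment checkout
    op = dpf.operators.averaging.elemental_mean_fc()
    print(op.specification.properties.get("license"))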
doc/source/operator_reference.rst (21 changes: 10 additions & 11 deletions)
@@ -4,16 +4,7 @@
Operators
=========

-DPF operators provide for manipulating and transforming simulation data.
-
-From DPF Server for Ansys 2023 R2 and later, the licensing logic for operators in DPF depend on the active
-`ServerContext <https://dpf.docs.pyansys.com/api/ansys.dpf.core.server_context.html#servercontext>`_.
-
-The available contexts are **Premium** and **Entry**.
-Licensed operators are marked as in the documentation using the ``license`` property.
-Operators with the ``license`` property as **None** do not require a license check-out.
-For more information about using these two contexts, see :ref:`user_guide_server_context`.
-Click below to access the operators documentation.
+DPF operators allow you to manipulate and transform simulation data.

.. grid:: 1

@@ -32,9 +23,17 @@ Click below to access the operators documentation.
:click-parent:


+For Ansys 2023 R2 and later, the DPF Server licensing logic for operators in DPF depends on the active
+`server context<https://dpf.docs.pyansys.com/version/stable/api/ansys.dpf.core.server_context.html#ansys.dpf.core.server_context.ServerContext>`_.
+
+The available contexts are **Premium** and **Entry**.
+Licensed operators are marked as such in the documentation using the ``license`` property.
+Operators with the ``license`` property set to **None** do not require a license checkout.
+For more information on using these two contexts, see :ref:`user_guide_server_context`.

.. note::

-For Ansys 2023 R1 and earlier, the context is equivalent to Premium, with all operators loaded.
+For Ansys 2023 R1 and earlier, the context is equivalent to **Premium**, with all operators loaded.
For DPF Server 2023.2.pre0 specifically, the server context defines which operators are loaded and
accessible. Use the `PyDPF-Core 0.7 operator documentation <https://dpf.docs.pyansys.com/version/0.7/operator_reference.html>`_ to learn more.
Some operators in the documentation might not be available for a particular server version.
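A short sketch of selecting a server context before starting a server (**Entry** shown here; **Premium** is the default for recent servers):

.. code-block:: python

    from ansys.dpf.core import server_context

    # Applies to servers started from this point on
    server_context.set_default_server_context(
        server_context.AvailableServerContexts.entry
    )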
doc/source/user_guide/concepts/concepts.rst (22 changes: 11 additions & 11 deletions)
@@ -3,33 +3,33 @@
==================
Terms and concepts
==================
-DPF sees **fields of data**, not physical results. This makes DPF a
+DPF uses **fields of data**, not physical results. This makes DPF a
very versatile tool that can be used across teams, projects, and
simulations.

Key terms
---------
Here are descriptions for key DPF terms:

-- **Data source:** One or more files containing analysis results.
-- **Field:** Main simulation data container.
-- **Field container:** For a transient, harmonic, modal, or multi-step
+- **Data source**: One or more files containing analysis results.
+- **Field**: Main simulation data container.
+- **Fields container**: For a transient, harmonic, modal, or multi-step
static analysis, a set of fields, with one field for each time step
or frequency.
-- **Location:** Type of topology associated with the data container. DPF
+- **Location**: Type of topology associated with the data container. DPF
uses three different spatial locations for finite element data: ``Nodal``,
``Elemental``, and ``ElementalNodal``.
-- **Operators:** Objects that are used to create, transform, and stream the data.
+- **Operators**: Objects that are used to create, transform, and stream the data.
An operator is composed of a **core** and **pins**. The core handles the
calculation, and the pins provide input data to and output data from
the operator.
-- **Scoping:** Spatial and/or temporal subset of a model's support.
-- **Support:** Physical entity that the field is associated with. For example,
+- **Scoping**: Spatial and/or temporal subset of a model's support.
+- **Support**: Physical entity that the field is associated with. For example,
the support can be a mesh, geometrical entity, or time or frequency values.
-- **Workflow:** Global entity that is used to evaluate the data produced
+- **Workflow**: Global entity that is used to evaluate the data produced
by chained operators.
-- **Meshed region:** Entity describing a mesh. Node and element scopings,
-element types, connectivity (list of node indices composing each element) and
+- **Meshed region**: Entity describing a mesh. Node and element scopings,
+element types, connectivity (list of node indices composing each element), and
node coordinates are the fundamental entities composing the meshed region.

Scoping
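To make these terms concrete, a small sketch that builds a field with a nodal location and a scoping of three node IDs (all values are illustrative):

.. code-block:: python

    from ansys.dpf import core as dpf

    # A 3D vector field over three entities
    field = dpf.fields_factory.create_3d_vector_field(num_entities=3)

    # The scoping lists the node IDs the data belongs to
    field.scoping.ids = [1, 2, 3]
    field.data = [[0.0, 0.0, 0.1], [0.0, 0.0, 0.2], [0.0, 0.0, 0.3]]

    print(field.location)  # "Nodal"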
doc/source/user_guide/concepts/stepbystep.rst (36 changes: 17 additions & 19 deletions)
@@ -23,7 +23,7 @@ Data can come from two sources:
- **Manual input in DPF:** You can create fields of data in DPF.

Once you specify data sources or manually create fields in DPF,
-you can create field containers (if applicable) and define scopings to
+you can create fields containers (if applicable) and define scopings to
identify the subset of data that you want to evaluate.

Specify the data source
@@ -103,27 +103,27 @@ This code shows how to define a mesh scoping:
my_scoping.location = "Nodal" #optional
my_scoping.ids = list(range(1,11))
-Define field containers
-~~~~~~~~~~~~~~~~~~~~~~~
-A **field container** holds a set of fields. It is used mainly for
+Define fields containers
+~~~~~~~~~~~~~~~~~~~~~~~~
+A **fields container** holds a set of fields. It is used mainly for
transient, harmonic, modal, or multi-step analyses. This image
explains its structure:

.. image:: ../../images/drawings/field-con-overview.png

-A field container is a vector of fields. Fields are ordered with labels
-and IDs. Most commonly, a field container is scoped on the time label,
+A fields container is a vector of fields. Fields are ordered with labels
+and IDs. Most commonly, a fields container is scoped on the time label,
and the IDs are the time or frequency sets:

.. image:: ../../images/drawings/field-con.png

-You can define a field container in multiple ways:
+You can define a fields container in multiple ways:

- Extract labeled data from a result file.
-- Create a field container from a CSV file.
-- Convert existing fields to a field container.
+- Create a fields container from a CSV file.
+- Convert existing fields to a fields container.

-This code shows how to define a field container from scratch:
+This code shows how to define a fields container from scratch:

.. code-block:: python
@@ -137,9 +137,9 @@ This code shows how to define a field container from scratch:
mscop = {"time":i+1,"complex":1}
fc.add_field(mscop,dpf.Field(nentities=i+10))
-Some operators can operate directly on field containers instead of fields.
-Field containers are identified by ``fc`` suffixes in their names.
-Operators and field containers are explained in more detail
+Some operators can operate directly on fields containers instead of fields.
+Fields containers are identified by ``fc`` suffixes in their names.
+Operators and fields containers are explained in more detail
in :ref:`transform_the_data`.

.. _transform_the_data:
@@ -155,18 +155,16 @@ Use operators
You use operators to import, export, transform, and analyze data.

An operator is analogous to an integrated circuit in electronics. It
-has a set of input and output pins. Pins provide for passing data to
-and from operators.
+has a set of input and output pins. Pins pass data to and from operators.

-An operator takes input from a field, field container, or scoping using
+An operator takes input from a field, fields container, or scoping using
an input pin. Based on what it is designed to do, the operator computes
-an output that it passes to a field or field container using an output pin.
+an output that it passes to a field or fields container using an output pin.

.. image:: ../../images/drawings/circuit.png

Comprehensive information on operators is available in :ref:`ref_dpf_operators_reference`.
-In the **Available Operators** area for either the **Entry** or **Premium** operators,
-you can either type a keyword in the **Search** option
+In the **Available Operators** area, you can either type a keyword in the **Search** option
or browse by operator categories:

.. image:: ../../images/drawings/help-operators.png
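As a sketch of chaining operators pin to pin (the result file path is a placeholder; ``norm_fc`` computes the norm of each field in a fields container):

.. code-block:: python

    from ansys.dpf import core as dpf
    from ansys.dpf.core import operators as ops

    model = dpf.Model("path/to/file.rst")  # placeholder path

    # The displacement output pin feeds the norm_fc input pin
    disp_op = model.results.displacement()
    norm_op = ops.math.norm_fc(disp_op)

    # Evaluate the chain; the output is a fields container
    fields = norm_op.outputs.fields_container()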