
Commit

Merge branch 'master' into feat/exposing_quantity_types
a-bouth authored Jan 20, 2025
2 parents 11e6e48 + 404befa commit a9153eb
Showing 61 changed files with 5,057 additions and 322 deletions.
4 changes: 4 additions & 0 deletions .ci/build_wheel.py
Original file line number Diff line number Diff line change
@@ -15,6 +15,10 @@
"win": "win_amd64",
"manylinux1": "manylinux1_x86_64",
"manylinux_2_17": "manylinux_2_17_x86_64",
# Accommodate tox.ini automatic platform substitutions
"linux": "manylinux_2_17_x86_64",
"win32": "win_amd64",
"darwin": "any",
}

argParser = argparse.ArgumentParser()
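
The added entries let the wheel builder accept the platform names that tox substitutes automatically (``sys.platform`` values such as ``linux``, ``win32``, and ``darwin``) alongside the explicit tags. A minimal sketch of how such a lookup could behave; ``resolve_platform_tag`` is a hypothetical helper for illustration, not part of ``build_wheel.py``:

```python
# Sketch of resolving a wheel platform tag from a platform alias.
# The mapping mirrors the diff above; resolve_platform_tag is a
# hypothetical helper, not part of build_wheel.py.
SUPPORTED_PLATFORMS = {
    "win": "win_amd64",
    "manylinux1": "manylinux1_x86_64",
    "manylinux_2_17": "manylinux_2_17_x86_64",
    # Accommodate tox.ini automatic platform substitutions
    "linux": "manylinux_2_17_x86_64",
    "win32": "win_amd64",
    "darwin": "any",
}

def resolve_platform_tag(name: str) -> str:
    """Return the wheel platform tag for a known platform alias."""
    try:
        return SUPPORTED_PLATFORMS[name]
    except KeyError as err:
        raise ValueError(f"Unsupported platform: {name!r}") from err
```

With this table, both ``manylinux_2_17`` and the tox-substituted ``linux`` resolve to the same ``manylinux_2_17_x86_64`` tag.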
2 changes: 1 addition & 1 deletion .github/workflows/pydpf-post.yml
@@ -159,7 +159,7 @@ jobs:
shell: bash
working-directory: pydpf-post/tests
run: |
pytest $DEBUG --reruns 2 .
pytest $DEBUG --maxfail=5 --reruns 2 .
if: always()
timeout-minutes: 60

1 change: 1 addition & 0 deletions doc/make.bat
@@ -9,6 +9,7 @@ if "%SPHINXBUILD%" == "" (
)
set SOURCEDIR=source
set BUILDDIR=build
set SPHINXOPTS=-n

if "%1" == "" goto help
if "%1" == "clean" goto clean
14 changes: 7 additions & 7 deletions doc/source/_static/dpf_operators.html

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion doc/source/_static/simple_example.rst
@@ -1,4 +1,4 @@
Here's how you would open a result file generated by Ansys MAPDL (or another Ansys solver) and
The following example shows how to open a result file generated by Ansys MAPDL (or another Ansys solver) and
extract results:

.. code-block:: default
Expand Down
10 changes: 5 additions & 5 deletions doc/source/getting_started/compatibility.rst
@@ -8,7 +8,7 @@ Operating system
----------------

DPF supports Windows 10 and Rocky Linux 8 and later.
To run DPF on CentOS 7, use DPF for 2024R2 (8.2) or older.
To run DPF on CentOS 7, use DPF for 2024 R2 (8.2) or older.
For more information, see `Ansys Platform Support <https://www.ansys.com/solutions/solutions-by-role/it-professionals/platform-support>`_.

Client-server
@@ -23,8 +23,8 @@ version.

As new features are developed, every attempt is made to ensure backward
compatibility from the client to the server. Backward compatibility is generally ensured for
the 4 latest Ansys versions. For example, ``ansys-dpf-core`` module with 0.8.0 version has been
developed for Ansys 2023 R2 pre1 release, for 2023 R2 Ansys version. It is compatible with
the four latest Ansys versions. For example, the ``ansys-dpf-core`` module 0.8.0 has been
developed for the Ansys 2023 R2 version. It is compatible with
2023 R2, 2023 R1, 2022 R2 and 2022 R1 Ansys versions.

Starting with version ``0.10`` of ``ansys-dpf-core``, the packages ``ansys-dpf-gate``,
@@ -34,8 +34,8 @@ and prevent synchronization issues between the PyDPF libraries, requiring to drop
previous to 2022 R2.

**Ansys strongly encourages you to use the latest packages available**, as long as they are compatible
with the Server version you want to run. Considering Ansys 2023 R1 for example, if ``ansys-dpf-core``
module with 0.10.0 version is the latest available compatible package, it should be used.
with the server version you want to run. Considering Ansys 2023 R1 for example, if ``ansys-dpf-core``
module 0.10.0 is the latest available compatible package, it should be used.

For ``ansys-dpf-core<0.10``, the `ansys.grpc.dpf <https://pypi.org/project/ansys-grpc-dpf/>`_
package should also be synchronized with the server version.
17 changes: 9 additions & 8 deletions doc/source/getting_started/dpf_server.rst
@@ -11,12 +11,12 @@ simulation workflow.
The DPF Server is packaged within the **Ansys installer** in Ansys 2021 R1 and later.

It is also available as a standalone package that contains all the files necessary to run the server and enable DPF capabilities.
The standalone DPF Server is available on the `DPF Pre-Release page <https://download.ansys.com/Others/DPF%20Pre-Release>`_ of the Ansys Customer Portal.
The standalone DPF Server is available on the `DPF Pre-Release page <https://download-archive.ansys.com/Others/DPF%20Pre-Release>`_ of the Ansys Customer Portal.
The first standalone version of DPF Server available is 6.0 (2023 R2).

The sections on this page describe how to install and use a standalone DPF Server.

* For a quick start on using PyDPF, see :ref:`ref_getting_started`.
* For a brief overview on using PyDPF, see :ref:`ref_getting_started`.
* For more information on DPF and its use, see :ref:`ref_user_guide`.


@@ -65,7 +65,7 @@ PyDPF-Core is a Python client API communicating with a **DPF Server**, either
through the network using gRPC or directly in the same process. PyDPF-Post is a Python
module for postprocessing based on PyDPF-Core.

Both PyDPF-Core and PyDPF-Post can be used with DPF Server. Installation instructions
Both PyDPF-Core and PyDPF-Post can be used with the DPF Server. Installation instructions
for PyDPF-Core are available in the PyDPF-Core `Getting started <https://dpf.docs.pyansys.com/version/stable/getting_started/install.html>`_.
Installation instructions for PyDPF-Post are available in the PyDPF-Post `Getting started <https://post.docs.pyansys.com/version/stable/getting_started/install.html>`_.

@@ -98,10 +98,10 @@ to use thanks to its ``ansys_path`` argument.
PyDPF otherwise follows the logic below to automatically detect and choose which locally installed
version of DPF Server to run:

- it uses the ``ANSYS_DPF_PATH`` environment variable in priority if set and targeting a valid path to a DPF Server installation.
- it then checks the currently active Python environment for any installed standalone DPF Server, and uses the latest version available.
- it then checks for ``AWP_ROOTXXX`` environment variables, which are set by the **Ansys installer**, and uses the latest version available.
- if then raises an error if all of the steps above failed to return a valid path to a DPF Server installation.
- It uses the ``ANSYS_DPF_PATH`` environment variable in priority if set and targeting a valid path to a DPF Server installation.
- It then checks the currently active Python environment for any installed standalone DPF Server, and uses the latest version available.
- It then checks for ``AWP_ROOTXXX`` environment variables, which are set by the **Ansys installer**, and uses the latest version available.
- It then raises an error if all of the preceding steps failed to return a valid path to a DPF Server installation.
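
The detection order above can be sketched in plain Python. This is a simplified illustration with hypothetical arguments, not the actual PyDPF implementation:

```python
import os

def find_dpf_server(standalone_versions=None, awp_roots=None):
    """Sketch of the server-detection order described above.

    standalone_versions and awp_roots are hypothetical stand-ins for what
    PyDPF discovers in the Python environment and in AWP_ROOTXXX variables;
    this is not the real implementation.
    """
    # 1. ANSYS_DPF_PATH takes priority when it is set.
    path = os.environ.get("ANSYS_DPF_PATH")
    if path:
        return path
    # 2. Latest standalone DPF Server installed in the Python environment.
    if standalone_versions:
        return standalone_versions[max(standalone_versions)]
    # 3. Latest Ansys installation advertised by AWP_ROOTXXX variables.
    if awp_roots is None:
        awp_roots = {k: v for k, v in os.environ.items()
                     if k.startswith("AWP_ROOT")}
    if awp_roots:
        return awp_roots[max(awp_roots)]  # highest XXX wins
    # 4. All checks failed: raise.
    raise RuntimeError("No valid DPF Server installation was found.")
```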

Run DPF Server in a Docker container
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -111,7 +111,8 @@ DPF Server can be run in a Docker container.
in :ref:`Install DPF Server <target_installing_server>`, download the ``Dockerfile`` file.
#. Optional: download any other plugin ZIP file as appropriate. For example, to access the ``composites`` plugin for Linux,
download ``ansys_dpf_composites_lin_v2025.1.pre0.zip``.
#. Copy all the ZIP files and ``Dockerfile`` file in a folder and navigate into that folder.
#. Copy all the ZIP files and the ``Dockerfile`` file into a folder.
#. Navigate into the folder used in the previous step.
#. To build the DPF Docker container, run the following command:

.. code::
2 changes: 1 addition & 1 deletion doc/source/getting_started/index.rst
@@ -6,7 +6,7 @@ Getting started

The Data Processing Framework (DPF) provides numerical simulation users and engineers with a toolbox
for accessing and transforming simulation data. DPF can access data from Ansys solver
result files as well as from several neutral (see :ref:`ref_main_index`).
result files as well as from several neutral file formats. For more information, see :ref:`ref_main_index`.

This **workflow-based** framework allows you to perform complex preprocessing and
postprocessing operations on large amounts of simulation data.
10 changes: 5 additions & 5 deletions doc/source/getting_started/install.rst
@@ -16,8 +16,8 @@ with this command:
pip install ansys-dpf-core
PyDPF-Core plotting capabilities require to have `PyVista <https://pyvista.org/>`_ installed.
To install PyDPF-Core with its optional plotting functionalities, use:
PyDPF-Core plotting capabilities require you to have `PyVista <https://pyvista.org/>`_ installed.
To install PyDPF-Core with its optional plotting functionalities, run this command:

.. code::
@@ -62,10 +62,10 @@ then use the following command from within this local directory:
pip install --no-index --find-links=. ansys-dpf-core
Beware that PyDPF-Core wheelhouses do not include the optional plotting dependencies.
To allow for plotting capabilities, also download the wheels corresponding to your platform and Python interpreter version
Note that PyDPF-Core wheelhouses do not include the optional plotting dependencies.
To use the plotting capabilities, also download the wheels corresponding to your platform and Python interpreter version
for `PyVista <https://pypi.org/project/pyvista/#files>`_ and
`matplotlib <https://pypi.org/project/matplotlib/#files>`_, then place them in the same previous local directory and run the command above.
`matplotlib <https://pypi.org/project/matplotlib/#files>`_. Then, place them in the same local directory and run the preceding command.


Install in development mode
29 changes: 14 additions & 15 deletions doc/source/getting_started/licensing.rst
@@ -4,11 +4,10 @@
Licensing
=========

This section details how to properly set up licensing, as well as what the user should expect in
terms of limitations or license usage when running PyDPF scripts.
This section describes how to properly set up licensing, as well as limitations and license usage when running PyDPF scripts.

DPF follows a client-server architecture, which means that the PyDPF client library must interact with a running DPF Server.
It either starts a DPF Server via a local installation of DPF Server, or it connects to an already running local or remote DPF Server.
DPF follows a client-server architecture, so the PyDPF client library must interact with a running DPF Server.
It either starts a DPF Server via a local DPF Server installation, or it connects to an already running local or remote DPF Server.

DPF Server is packaged within the **Ansys installer** in Ansys 2021 R1 and later.
It is also available as a standalone application.
@@ -20,12 +19,12 @@ For more information on installing DPF Server, see :ref:`ref_dpf_server`.
License terms
-------------

When using the DPF Server from an Ansys installation, the user has already agreed to the licensing
When using the DPF Server from an Ansys installation, you have already agreed to the licensing
terms when installing Ansys.

When using a standalone DPF Server, the user must accept the ``DPF Preview License Agreement``
When using a standalone DPF Server, you must accept the ``DPF Preview License Agreement``
by following the instructions below.
Starting a DPF Server without agreeing to the ``DPF Preview License Agreement`` throws an exception.
Starting a DPF Server without agreeing to the ``DPF Preview License Agreement`` raises an exception.

DPF Preview License Agreement
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -51,7 +50,7 @@ existing license for the edition and version of DPF Server that you intend to use
Configure licensing
-------------------

If your machine does not have a local Ansys installation, you need to define where DPF should look for a valid license.
If your machine does not have a local Ansys installation, you must define where DPF should look for a valid license.

To use a local license file, set the ``ANSYSLMD_LICENSE_FILE`` environment
variable to point to an Ansys license file ``<license_file_to_use>``:
@@ -85,12 +84,12 @@ License checks and usage
------------------------

Some DPF operators require DPF to check for an existing license
and some require DPF to check-out a compatible license increment.
and some require DPF to check out a compatible license increment.

DPF is by default allowed to check-out license increments as needed.
DPF is by default allowed to check out license increments as needed.
To change this behavior, see :ref:`here <licensing_server_context>`.

To know if operators require a license increment check-out to run, check their ``license``
To know if operators require a license increment checkout to run, check their ``license``
attribute in :ref:`ref_dpf_operators_reference` or directly in Python by checking the operator's
properties for a ``license`` key:

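
In essence, the check amounts to looking up a ``license`` key in the operator's properties. The dictionaries below are made-up examples for illustration, not real DPF operator data:

```python
# Illustrative check of an operator's properties for a "license" key.
# Both dictionaries are hypothetical examples, not real DPF output.
def requires_license_checkout(properties: dict) -> bool:
    """Return True when the operator declares a license requirement."""
    return properties.get("license") is not None

licensed_op = {"category": "result", "license": "any_dpf_supported_increments"}
source_op = {"category": "metadata"}  # no "license" key: no checkout needed
```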
@@ -109,13 +108,13 @@ properties for a ``license`` key:
To check which Ansys licensing increments correspond to ``any_dpf_supported_increments``,
see :ref:`here<target_to_ansys_license_increments_list>`.
see :ref:`Compatible Ansys license increments<target_to_ansys_license_increments_list>`.

Even if an operator does not require a license check-out to run, most DPF operators still require
Even if an operator does not require a license checkout to run, most DPF operators still require
DPF to check for a reachable license server or license file.

Operators which do not perform any kind of license check are source operators (data extraction
operators) which do not perform any data transformation.
Operators that do not perform any kind of license check are source operators (data extraction
operators). These operators do not perform any data transformation.

Result operators, for example, perform data transformation if the requested
location is not the native result location. In that case, averaging occurs, which is considered
21 changes: 10 additions & 11 deletions doc/source/operator_reference.rst
@@ -4,16 +4,7 @@
Operators
=========

DPF operators provide for manipulating and transforming simulation data.

From DPF Server for Ansys 2023 R2 and later, the licensing logic for operators in DPF depend on the active
`ServerContext <https://dpf.docs.pyansys.com/api/ansys.dpf.core.server_context.html#servercontext>`_.

The available contexts are **Premium** and **Entry**.
Licensed operators are marked as in the documentation using the ``license`` property.
Operators with the ``license`` property as **None** do not require a license check-out.
For more information about using these two contexts, see :ref:`user_guide_server_context`.
Click below to access the operators documentation.
DPF operators allow you to manipulate and transform simulation data.

.. grid:: 1

@@ -32,9 +23,17 @@ Click below to access the operators documentation.
:click-parent:


For Ansys 2023 R2 and later, the DPF Server licensing logic for operators in DPF depends on the active
`server context <https://dpf.docs.pyansys.com/version/stable/api/ansys.dpf.core.server_context.html#ansys.dpf.core.server_context.ServerContext>`_.

The available contexts are **Premium** and **Entry**.
Licensed operators are marked as such in the documentation using the ``license`` property.
Operators with the ``license`` property set to **None** do not require a license checkout.
For more information on using these two contexts, see :ref:`user_guide_server_context`.

.. note::

For Ansys 2023 R1 and earlier, the context is equivalent to Premium, with all operators loaded.
For Ansys 2023 R1 and earlier, the context is equivalent to **Premium**, with all operators loaded.
For DPF Server 2023.2.pre0 specifically, the server context defines which operators are loaded and
accessible. Use the `PyDPF-Core 0.7 operator documentation <https://dpf.docs.pyansys.com/version/0.7/operator_reference.html>`_ to learn more.
Some operators in the documentation might not be available for a particular server version.
22 changes: 11 additions & 11 deletions doc/source/user_guide/concepts/concepts.rst
@@ -3,33 +3,33 @@
==================
Terms and concepts
==================
DPF sees **fields of data**, not physical results. This makes DPF a
DPF uses **fields of data**, not physical results. This makes DPF a
very versatile tool that can be used across teams, projects, and
simulations.

Key terms
---------
Here are descriptions for key DPF terms:

- **Data source:** One or more files containing analysis results.
- **Field:** Main simulation data container.
- **Field container:** For a transient, harmonic, modal, or multi-step
- **Data source**: One or more files containing analysis results.
- **Field**: Main simulation data container.
- **Fields container**: For a transient, harmonic, modal, or multi-step
static analysis, a set of fields, with one field for each time step
or frequency.
- **Location:** Type of topology associated with the data container. DPF
- **Location**: Type of topology associated with the data container. DPF
uses three different spatial locations for finite element data: ``Nodal``,
``Elemental``, and ``ElementalNodal``.
- **Operators:** Objects that are used to create, transform, and stream the data.
- **Operators**: Objects that are used to create, transform, and stream the data.
An operator is composed of a **core** and **pins**. The core handles the
calculation, and the pins provide input data to and output data from
the operator.
- **Scoping:** Spatial and/or temporal subset of a model's support.
- **Support:** Physical entity that the field is associated with. For example,
- **Scoping**: Spatial and/or temporal subset of a model's support.
- **Support**: Physical entity that the field is associated with. For example,
the support can be a mesh, geometrical entity, or time or frequency values.
- **Workflow:** Global entity that is used to evaluate the data produced
- **Workflow**: Global entity that is used to evaluate the data produced
by chained operators.
- **Meshed region:** Entity describing a mesh. Node and element scopings,
element types, connectivity (list of node indices composing each element) and
- **Meshed region**: Entity describing a mesh. Node and element scopings,
element types, connectivity (list of node indices composing each element), and
node coordinates are the fundamental entities composing the meshed region.
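
As a toy illustration of the terms above, a field of data tied to a location can be grouped, one field per time step, into a fields container. These are hypothetical classes for illustration only, not the real PyDPF API:

```python
from dataclasses import dataclass, field

# Toy data model for the DPF terms above; hypothetical classes,
# not the real PyDPF API.

@dataclass
class Field:
    location: str   # "Nodal", "Elemental", or "ElementalNodal"
    data: list      # the simulation data itself

@dataclass
class FieldsContainer:
    """One field per time step or frequency, as in a transient analysis."""
    fields: list = field(default_factory=list)

    def add_field(self, f: Field) -> None:
        self.fields.append(f)

container = FieldsContainer()
container.add_field(Field(location="Nodal", data=[0.0, 0.1]))
container.add_field(Field(location="Nodal", data=[0.0, 0.2]))
```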

Scoping