From e4b4657a6d0e2601d6352de5a7f113348756c515 Mon Sep 17 00:00:00 2001 From: architeuthidae Date: Tue, 13 Aug 2024 12:41:58 +0800 Subject: [PATCH] doc: using_data_interfaces associated changes --- doc/manual/default_network_ports.rst | 4 +-- doc/manual/environment.rst | 16 ++++++---- doc/manual/faq.rst | 11 ++++++- doc/manual/getting_started_core.rst | 6 ++-- doc/manual/getting_started_mgmt.rst | 48 ++++++++++++++++++++++++++-- doc/manual/management_system.rst | 31 +++++++++++------- 6 files changed, 90 insertions(+), 26 deletions(-) diff --git a/doc/manual/default_network_ports.rst b/doc/manual/default_network_ports.rst index ddd8f0712..b3df35961 100644 --- a/doc/manual/default_network_ports.rst +++ b/doc/manual/default_network_ports.rst @@ -10,9 +10,9 @@ Default network ports +---------------------------------+--------------+ | Core device (analyzer) | 1382 | +---------------------------------+--------------+ -| Moninj (core device or proxy) | 1383 | +| MonInj (core device or proxy) | 1383 | +---------------------------------+--------------+ -| Moninj (proxy control) | 1384 | +| MonInj (proxy control) | 1384 | +---------------------------------+--------------+ | Core analyzer proxy (proxy) | 1385 | +---------------------------------+--------------+ diff --git a/doc/manual/environment.rst b/doc/manual/environment.rst index 9f01526e5..e2fd4d9ef 100644 --- a/doc/manual/environment.rst +++ b/doc/manual/environment.rst @@ -29,7 +29,7 @@ This is stored in a Python dictionary whose keys are the device names, which the "arguments": {"channel": 19} }, -Note that the key (the name of the device) is ``led`` and the value is itself a Python dictionary. Names will later be used to gain access to a device through methods such as ``self.setattr_device("led")``. While in this case ``led`` can be replaced with another name, provided it is used consistently, some names (e.g. in particular ``core``) are used internally by ARTIQ and will cause problems if changed. It is often more convenient to use aliases for renaming purposes, see below. +Note that the key (the name of the device) is ``led`` and the value is itself a Python dictionary. Names will later be used to gain access to a device through methods such as ``self.setattr_device("led")``. While in this case ``led`` can be replaced with another name, provided it is used consistently, some names (in particular, ``core``) are used internally by ARTIQ and will cause problems if changed. It is often more convenient to use aliases for renaming purposes, see below. .. note:: The device database is generated and stored in the memory of the master when the master is first started. Changes to the ``device_db.py`` file will not immediately affect a running master. In order to update the device database, right-click in the Explorer window and select 'Scan device database', or run the command ``artiq_client scan-devices``. @@ -37,11 +37,11 @@ Note that the key (the name of the device) is ``led`` and the value is itself a .. warning:: It is important to understand that the device database does not *set* your system configuration, only *describe* it. If you change the devices available to your system, it is usually necessary to edit the device database, but editing the database will not change what devices are available to your system. - Remote (normally, non-realtime) devices must have accessible, suitable controllers and drivers; see :doc:`developing_a_ndsp` for more information, including how to add entries for new remote devices to your device database. 
Local devices (normally, realtime, e.g. your Sinara hardware) must be factually attached to your system, and more importantly, your gateware and firmware must have been compiled to account for them, and to expect them at those ports. + Remote (normally, non-realtime) devices must have accessible, suitable controllers and drivers; see :doc:`developing_a_ndsp` for more information, including how to add entries for new remote devices to your device database. Local devices (normally, realtime, e.g. your Sinara hardware) must be connected to your system, and more importantly, your gateware and firmware must have been compiled to account for them, and to expect them at those ports. - While controllers can be added and removed to your device database relatively easily, in order to make new real-time hardware accessible, it is generally also necessary to recompile and reflash your gateware and firmware. (If you purchase your hardware from M-Labs, you will normally be provided with new binaries and necessary assistance.) + While controllers can be added to and removed from your device database relatively easily, in order to make new real-time hardware accessible, it is generally also necessary to recompile and reflash your gateware and firmware. (If you purchase your hardware from M-Labs, you will be provided with new binaries and necessary assistance.) See :doc:`building_developing`. - Adding or removing new real-time hardware is a difference in *system configuration,* which must be specified at compilation time of gateware and firmware. For Kasli and Kasli-SoC, this is managed in the form of a JSON usually called the *system description* file. The device database generally provides that information to ARTIQ which can change from instance to instance ARTIQ is run, e.g., device names and aliases, network addresses, clock frequencies, and so on. The system configuration defines that information which is *not* permitted to change, e.g., what device is associated with which EEM port or RTIO channels. Insofar as data is duplicated between the two, the device database is obliged to agree with the system description, not the other way around. + Adding or removing new real-time hardware is a difference in *system configuration,* which must be specified at compilation time of gateware and firmware. For Kasli and Kasli-SoC, this is managed in the form of a JSON usually called the :ref:`system description file`. The device database generally provides that information which can change from one instance of ARTIQ to another, e.g., device names and aliases, network addresses, clock frequencies, and so on. The system configuration defines that information which is *not* permitted to change, e.g., what device is associated with which EEM port or RTIO channels. Insofar as data is duplicated between the two, the device database is obliged to agree with the system description, not the other way around. If you obtain your hardware from M-Labs, you will always be provided with a ``device_db.py`` to match your system configuration, which you can edit as necessary to add remote devices, aliases, and so on. In the relatively unlikely case that you are writing a device database from scratch, the :mod:`~artiq.frontend.artiq_ddb_template` utility can be used to generate a template device database directly from the JSON system description used to compile your gateware and firmware. This is the easiest way to ensure that details such as the allocation of RTIO channel numbers will be represented in the device database correctly. See also the corresponding entry in :ref:`Utilities `.
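For instance, assuming the system description used to compile your binaries is named ``my_system.json`` (the file name here is only a placeholder), a matching template might be generated along the lines of: ::

    $ artiq_ddb_template my_system.json -o device_db.py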
@@ -73,18 +73,20 @@ If an entry is a string, that string is used as a key for another lookup in the Arguments --------- -Arguments are values that parameterize the behavior of an experiment. ARTIQ supports both interactive arguments, requested and supplied at some point while an experiment is running, and submission-time arguments, requested in the build phase and set before the experiment is executed. For more on arguments in practice, see the tutorial section :ref:`mgmt-arguments`. For supported argument types and specific reference, see the relevant sections of :doc:`the core language reference `, as well as the example experiment ``examples/no_hardware/interactive.py``. +Arguments are values that parameterize the behavior of an experiment. ARTIQ supports both interactive arguments, requested and supplied at some point while an experiment is running, and submission-time arguments, requested in the build phase and set before the experiment is executed. For more on arguments in practice, see the tutorial section :ref:`mgmt-arguments`. For supported argument types, see the reference for :mod:`artiq.language.environment`; for specific methods, see the reference for :class:`~artiq.language.environment.HasEnvironment`. + +.. _environment-datasets: Datasets -------- -Datasets are values that are read and written by experiments kept in a key-value store. They exist to facilitate the exchange and preservation of information between experiments, from experiments to the management system, and from experiments to long-term storage. Datasets may be either scalars (``bool``, ``int``, ``float``, or NumPy scalar) or NumPy arrays. For basic use of datasets, see the :ref:`management system tutorial `. +Datasets are values that are read and written by experiments and kept in a key-value store. They exist to facilitate the exchange and preservation of information between experiments, from experiments to the management system, and from experiments to long-term storage. Datasets may be either scalars (``bool``, ``int``, ``float``, or NumPy scalar) or NumPy arrays. For basic use of datasets, see the :ref:`interactivity tutorial `. A dataset may be broadcast (``broadcast=True``), that is, distributed to all clients connected to the master. This is useful e.g. for the ARTIQ dashboard to plot results while an experiment is in progress and give rapid feedback to the user. Broadcasted datasets live in a global key-value store owned by the master. Care should be taken that experiments use distinctive real-time result names in order to avoid conflicts. Broadcasted datasets may be used to communicate values across experiments; for instance, a periodic calibration experiment might update a dataset read by payload experiments. Broadcasted datasets are replaced when a new dataset with the same key (name) is produced. By default, they are erased when the master halts. Broadcasted datasets may be made persistent (``persistent=True``, which also implies ``broadcast=True``), in which case the master stores them in a LMDB database typically called ``dataset_db.mdb``, where they are saved across master restarts. -By default, datasets are archived in the HDF5 output for that run, although this can be opted against (``archive=False``). +By default, datasets are archived in the ``results`` HDF5 output for that run, although this can be opted against (``archive=False``).
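As a minimal sketch of how these flags are passed (the key names and values below are placeholders rather than part of any particular system), an experiment's :meth:`~artiq.language.environment.Experiment.run` method might contain: ::

    def run(self):
        # distributed to connected clients, e.g. so dashboard applets can plot it live
        self.set_dataset("spectrum", [0.0] * 100, broadcast=True)
        # stored in dataset_db.mdb and kept across master restarts
        self.set_dataset("calibration.offset", 0.2, persistent=True)
        # available at runtime but not archived in the HDF5 results
        self.set_dataset("scratch_counter", 42, archive=False)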
Datasets and units ^^^^^^^^^^^^^^^^^^ diff --git a/doc/manual/faq.rst b/doc/manual/faq.rst index 3b90989cc..249992e79 100644 --- a/doc/manual/faq.rst +++ b/doc/manual/faq.rst @@ -8,7 +8,7 @@ The content of this manual is arranged in rough reading order. If you start at t **If you are just starting out,** and would like to get ARTIQ set up on your computer and your core device, start with :doc:`installing`, :doc:`flashing`, and :doc:`configuring`, in that order. -**If you have a working ARTIQ setup** (or someone else has set it up for you), start with the tutorials: read :doc:`rtio`, then progress to :doc:`getting_started_core` and :doc:`getting_started_mgmt`. If your system is in a DRTIO configuration, :doc:`DRTIO and subkernels ` will also be helpful. +**If you have a working ARTIQ setup** (or someone else has set it up for you), start with the tutorials: read :doc:`rtio`, then progress to :doc:`getting_started_core`, :doc:`getting_started_mgmt`, and :doc:`using_data_interfaces`. If your system is in a DRTIO configuration, :doc:`DRTIO and subkernels ` will also be helpful. Pages like :doc:`management_system` and :doc:`core_device` describe **specific components of the ARTIQ ecosystem** in more detail. If you want to understand more about device and dataset databases, for example, read the :doc:`environment` page; if you want to understand the ARTIQ Python dialect and everything it does or does not support, read the :doc:`compiler` page. @@ -106,6 +106,15 @@ Copy the ``examples`` folder from that path into your home or user directory, an On the other hand, if you have progressed past this level and would like to see more in-depth code or real-life examples of how other groups have handled running experiments with ARTIQ, see the "Community code" directory on the M-labs `resources page `_. +fix ``failed to connect to moninj`` in the dashboard? +----------------------------------------------------- + +This and other similar messages almost always indicate that your device database lists controllers (for example, ``aqctl_moninj_proxy``) that either haven't been started or aren't reachable at the given host and port. See :ref:`mgmt-ctlmgr`, or navigate to the directory containing your ``device_db.py`` and run: :: + + $ artiq_ctlmgr + +to let the controller manager start the necessary controllers automatically. + diagnose and fix sequence errors? --------------------------------- diff --git a/doc/manual/getting_started_core.rst b/doc/manual/getting_started_core.rst index d782e9fd5..080ac86b4 100644 --- a/doc/manual/getting_started_core.rst +++ b/doc/manual/getting_started_core.rst @@ -1,5 +1,5 @@ -Getting started with the core language -====================================== +Getting started with the core device +==================================== As a very first step, we will turn on a LED on the core device. Create a file ``led.py`` containing the following: :: @@ -207,7 +207,7 @@ The core device records the real-time I/O waveforms into a circular buffer. It i rtio_log("ttl0", "i", i) delay(...) -When using :mod:`~artiq.frontend.artiq_run`, the recorded data can be extracted using :mod:`~artiq.frontend.artiq_coreanalyzer`. To export it to VCD, which can be viewed using third-party tools such as GtkWave, use the command ``artiq_coreanalyzer -w rtio.vcd``.
Recorded data can also be viewed directly with the ARTIQ dashboard, which will be presented later in :doc:`getting_started_mgmt`. +When using :mod:`~artiq.frontend.artiq_run`, the recorded data can be extracted using :mod:`~artiq.frontend.artiq_coreanalyzer`. To export it to VCD, which can be viewed using third-party tools such as GtkWave, use a command in the form of ``artiq_coreanalyzer -w <file_name>.vcd``. Recorded data can also be viewed directly with the ARTIQ dashboard, which will be presented later in :doc:`getting_started_mgmt`. .. _getting-started-dma: diff --git a/doc/manual/getting_started_mgmt.rst b/doc/manual/getting_started_mgmt.rst index 0a31a1227..5099c1d79 100644 --- a/doc/manual/getting_started_mgmt.rst +++ b/doc/manual/getting_started_mgmt.rst @@ -69,7 +69,7 @@ In other words, a worker created by the master has executed the experiment, and Both IPv4 and IPv6 are supported. See also the individual references :mod:`~artiq.frontend.artiq_master`, :mod:`~artiq.frontend.artiq_dashboard`, and :mod:`~artiq.frontend.artiq_client` for more details. -You may also notice that the master has created some other organizational files in its home directory, notably a folder ``results``, where a HDF5 record is preserved of every experiment that is submitted and run. The files in ``results`` will be discussed in greater detail in :doc:`using_interactivity`. +You may also notice that the master has created some other organizational files in its home directory, notably a folder ``results``, where an HDF5 record is preserved of every experiment that is submitted and run. The files in ``results`` will be discussed in greater detail in :doc:`using_data_interfaces`. Running the dashboard and controller manager -------------------------------------------- @@ -83,7 +83,7 @@ First, start the controller manager: :: Like the master, this command should not return, as the controller manager keeps running. Note that the controller manager requires access to the device database, but not in the local directory -- it gets that access automatically by connecting to the master. .. note:: - We will not be using controllers in this part of the tutorial. Nonetheless, the dashboard will expect to be able to contact certain controllers given in the device database, and print error messages if this isn't the case (e.g. ``Is aqctl_moninj_proxy running?``). It is equally possible to check your device database and start the requisite controllers manually, or to temporarily delete their entries from ``device_db.py``, but it's normally quite convenient to let the controller manager handle things. The role and use of controller managers will be covered in more detail in :doc:`using_interactivity`. + We will not be using controllers in this part of the tutorial. Nonetheless, the dashboard will expect to be able to contact certain controllers given in the device database, and print error messages if this isn't the case (e.g. ``Is aqctl_moninj_proxy running?``). It is equally possible to check your device database and start the requisite controllers manually, or to temporarily delete their entries from ``device_db.py``, but it's normally quite convenient to let the controller manager handle things. The role and use of controller managers will be covered in more detail in :doc:`using_data_interfaces`.
In a third terminal, start the dashboard: :: @@ -128,6 +128,50 @@ If you switch the 'Log' dock to its 'Schedule' tab while the experiment is still In the meantime, you can try out submitting either of the two experiments with different priority levels and take a look at the queues that ensue. If you are interested, you can try submitting experiments through the command line client at the same time, or even open a second dashboard in a different terminal. Observe that no matter the source, all submitted experiments will be accounted for and handled by the scheduler in an orderly way. +.. _mgmt-arguments: + +Adding arguments +---------------- + +Experiments may have arguments, values which can be set in the dashboard on submission and used in the experiment's code. Create a new experiment called ``argument_tutorial.py``, and give it the following :meth:`~artiq.language.environment.HasEnvironment.build` and :meth:`~artiq.language.environment.Experiment.run` functions: :: + + def build(self): + self.setattr_argument("count", NumberValue(precision=0, step=1)) + + def run(self): + for i in range(self.count): + print("Hello World", i) + +The method :meth:`~artiq.language.environment.HasEnvironment.setattr_argument` acts to set the argument and make its value accessible, similar to the effect of :meth:`~artiq.language.environment.HasEnvironment.setattr_device`. The second input sets the type of the argument; here, :class:`~artiq.language.environment.NumberValue` represents a floating point numerical value. To learn what other types are supported, see :mod:`artiq.language.environment` and :mod:`artiq.language.scan`. + +Rescan the repository as before. Open the new experiment in the dashboard. Above the submission options, you should now see a spin box that allows you to set the value of ``count``. Try setting it and submitting it. + +Interactive arguments +--------------------- + +With standard arguments, it is only possible to use :meth:`~artiq.language.environment.HasEnvironment.setattr_argument` in :meth:`~artiq.language.environment.HasEnvironment.build`; these arguments are always requested at submission time. However, it is also possible to use *interactive* arguments, which can be requested and supplied inside :meth:`~artiq.language.environment.Experiment.run`, while the experiment is being executed. Modify the experiment as follows (and push the result): :: + + def build(self): + pass + + def run(self): + repeat = True + while repeat: + print("Hello World") + with self.interactive(title="Repeat?") as interactive: + interactive.setattr_argument("repeat", BooleanValue(True)) + repeat = interactive.repeat + +Close and reopen the submission window, or click on the button labeled 'Recompute all arguments', in order to update the submission parameters. Submit again. It should print once, then wait; you may notice in 'Schedule' that the experiment does not exit, but hangs at status 'running'. + +Now, in the same dock as 'Explorer', navigate to the tab 'Interactive Args'. You can now choose and submit a value for 'repeat'. Every time an interactive argument is requested, the experiment pauses until an input is supplied. + +.. note:: + If you choose to 'Cancel' instead, a :exc:`~artiq.language.environment.CancelledArgsError` will be raised (which an experiment can catch, instead of halting). + +In order to request and supply multiple interactive arguments at once, simply place them in the same ``with`` block; see also the example ``interactive.py`` in ``examples/no_hardware``. + +
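As a brief sketch (the argument names here are arbitrary), two interactive arguments could be requested in a single window as follows: ::

    def run(self):
        with self.interactive(title="Scan settings") as interactive:
            # both arguments appear together in the 'Interactive Args' tab
            interactive.setattr_argument("npoints", NumberValue(precision=0, step=1))
            interactive.setattr_argument("save_results", BooleanValue(True))
        # as in the example above, the supplied values are read back once the block exits
        print(interactive.npoints, interactive.save_results)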
.. _master-setting-up-git: Setting up Git integration diff --git a/doc/manual/management_system.rst b/doc/manual/management_system.rst index dc807b9dd..28c2c0a1f 100644 --- a/doc/manual/management_system.rst +++ b/doc/manual/management_system.rst @@ -12,20 +12,31 @@ Master The :ref:`ARTIQ master ` is responsible for managing the parameter and device databases, the experiment repository, scheduling and running experiments, archiving results, and distributing real-time results. It is a headless component, and one or several clients (command-line or GUI) use the network to interact with it. -It should not be confused with the 'master' device in a DRTIO system, which is only a designation for the particular core device acting as central node in a distributed configuration of ARTIQ. The two concepts are otherwise unrelated. +The master expects to be given a directory on startup, the experiment repository, containing the experiments, which are automatically tracked and communicated to clients. By default, it simply looks for a directory called ``repository``. The ``-r`` flag can be used to substitute an alternate location. -Command-line client -^^^^^^^^^^^^^^^^^^^ +It also expects access to a ``device_db.py``, with a corresponding flag ``--device-db`` to substitute a different file name. Additionally, it will reference or create certain files in the directory it is run in, among them ``dataset_db.mdb``, the LMDB database containing persistent datasets, ``last_rid.pyon``, which simply contains the last used RID, and the ``results`` directory. -The :ref:`command-line client ` connects to the master and permits modification and monitoring of the databases, monitoring the experiment schedule and log, and submitting experiments. +.. note:: + Because the other parts of the management system all seem to be able to access the information stored in these files, confusion can sometimes result about where it is really stored and how it is distributed. Device databases, datasets, results, and experiments are all solely kept and administered by the master, which communicates information to dashboards, browsers, and clients over the network whenever necessary. -Dashboard -^^^^^^^^^ + Notably, clients and dashboards do not send in experiments to the master; they request them from the array of experiments the master knows about, primarily those in ``repository``, but also in the master's local file system, if 'Open file outside repository' is selected. This is true even if ``repository`` is configured as a Git repository and cloned on other machines. -The :ref:`dashboard ` connects to the master and is the main method of interacting with it. The main features of the dashboard are scheduling of experiments, setting of their arguments, examining the schedule, displaying real-time results, and debugging TTL and DDS channels in real time. +The ARTIQ master should not be confused with the 'master' device in a DRTIO system, which is only a designation for the particular core device acting as central node in a distributed configuration of ARTIQ. The two concepts are otherwise unrelated. + +Clients +^^^^^^^ + +The :ref:`command-line client ` connects to the master and permits modification and monitoring of the databases, reading the experiment schedule and log, and submitting experiments. + +The :ref:`dashboard ` connects to the master and is the main method of interacting with it.
The main roles of the dashboard are scheduling of experiments, setting of their arguments, examining the schedule, displaying real-time results, and debugging TTL and DDS channels in real time. The dashboard remembers and restores GUI state (window/dock positions, last values entered by the user, etc.) in between instances. This information is stored in a file called ``artiq_dashboard_{server}_{port}.pyon`` in the configuration directory (e.g. generally ``~/.config/artiq`` for Unix, same as data directory for Windows), distinguished in subfolders by ARTIQ version. +Browser +^^^^^^^ + +The :ref:`browser ` is used to read ARTIQ ``results`` HDF5 files and run experiment :meth:`~artiq.language.environment.Experiment.analyze` functions, in particular to retrieve previous result databases, process them, and display them in ARTIQ applets. The browser also remembers and restores its GUI state; this is stored in a file called simply ``artiq_browser``, kept in the same configuration directory as the dashboard. + Controller manager ^^^^^^^^^^^^^^^^^^ @@ -36,16 +47,14 @@ A controller manager connects to the master and accesses the device database thr Git integration --------------- -The master may use a Git repository to store experiment source code. Using Git has many advantages. For example, each result file (HDF5) contains the commit ID corresponding to the exact source code it was produced by, which helps reproducibility. - -Although the master also supports non-bare repositories, it is recommended to use a bare repository (e.g. ``git init --bare``) to easily support push transactions from clients. +The master may use a Git repository to store experiment source code. Using Git has many advantages. For example, each result file (HDF5) contains the commit ID corresponding to the exact source code it was produced by, which helps reproducibility. Although the master also supports non-bare repositories, it is recommended to use a bare repository (e.g. ``git init --bare``) to easily support push transactions from clients. You will want Git to notify the master every time the repository is pushed to (e.g. updated), so that the master knows to rescan the repository for new or changed experiments. This is easiest done with the ``post-receive`` hook, as described in :ref:`master-setting-up-git`. .. note:: If you plan to run the ARTIQ system entirely on a single machine, you may also consider using a non-bare repository and the ``post-commit`` hook to trigger repository scans every time you commit changes (locally). In this case, note that the ARTIQ master never uses the repository's working directory, but only what is committed. More precisely, when scanning the repository, it fetches the last (atomically) completed commit at that time of repository scan and checks it out in a temporary folder. This commit ID is used by default when subsequently submitting experiments. There is one temporary folder by commit ID currently referenced in the system, so concurrently running experiments from different repository revisions is fully supported by the master. -By default, the dashboard runs experiments from the repository, whereas the command-line client (``artiq_client submit``) runs experiments from the raw filesystem (which is useful for iterating rapidly without creating many disorganized commits). 
In order to run from the raw filesystem when using the dashboard, right-click in the Explorer window and select the option "Open file outside repository"; in order to run from the repository when using the command-line client, simply pass the ``-R`` flag. +By default, the dashboard runs experiments from the repository, whereas the command-line client (``artiq_client submit``) runs experiments from the raw filesystem (which is useful for iterating rapidly without creating many disorganized commits). In order to run from the raw filesystem when using the dashboard, right-click in the Explorer window and select the option 'Open file outside repository'. In order to run from the repository when using the command-line client, simply pass the ``-R`` flag. .. _experiment-scheduling: