Dynamic attributes and typing #258

Closed
opened 2022-04-07 01:55:33 +08:00 by lriesebos · 8 comments
Collaborator

With the current compiler, we make use of dynamic attributes, and we can update the `kernel_invariants` set to make them invariant as well. For NAC3, attributes need to be explicitly typed, which is good, but I am not sure whether we can still use dynamic attributes in that scenario, as we might not be able to insert the necessary typing annotations.

So for example [DAX.scan](https://gitlab.com/duke-artiq/dax/-/blob/master/dax/base/scan.py#L40) uses dynamic attributes. The idea is that we do not know beforehand what the attribute names are. See the example below.

```python
import typing
from artiq.experiment import *


class _ScanItem:
    kernel_invariants: typing.Set[str]

    def __init__(self, **kwargs: typing.Any):
        # Create kernel invariants attribute
        super().__setattr__('kernel_invariants', set())
        # Set all kwargs as attributes
        for k, v in kwargs.items():
            setattr(self, k, v)

    def __setattr__(self, key: str, value: typing.Any) -> None:
        # Set attribute by calling super
        super().__setattr__(key, value)
        # Add attribute to kernel invariants
        self.kernel_invariants.add(key)


class ArtiqTestExperiment(EnvExperiment):

    def build(self):
        self.setattr_device('core')
        items = {'foo': 1, 'bar': 3}  # These items are not known beforehand
        self.scan_item = _ScanItem(**items)

    @kernel
    def run(self):
        print(self.scan_item.foo)
        print(self.scan_item.bar)
```

This code would work fine and prints

```
[nix-shell:~/kc705]$ artiq_run repository/artiq_test.py
1
3
```

For NAC3, would it still be possible to use dynamically added attributes (which we can not type beforehand) in kernels?
Owner

Sounds difficult - NAC3 has its own parser and uses it to determine the kernel variables of a class, regardless of what the dynamic CPython code did to the class attributes.
What about using a dictionary instead?

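To illustrate the parser point, here is a minimal host-side sketch (the class and names are hypothetical, not NAC3 API): class-level annotations are visible to a static parser without executing any CPython code, whereas attributes added via `setattr()` leave no trace in the source.

```python
import typing


class StaticPoint:
    # These class-level annotations can be read straight from the source
    # text, which is what a static parser like NAC3's relies on.
    foo: int
    bar: int

    def __init__(self, foo: int, bar: int) -> None:
        self.foo = foo
        self.bar = bar


# The declared attributes are discoverable without running __init__;
# anything added dynamically with setattr() would not appear here.
print(typing.get_type_hints(StaticPoint))
```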
Author
Collaborator

Are dictionaries supported in NAC3 kernels? And I don't think the typing would work out for two values with different types anyway.

Owner

They could be implemented, but all keys would need to have the same type and all values would need to have the same type. Otherwise even simple code such as `x = d[y]` could not typecheck (what types should be assigned to `x` and `y`?).

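A hedged sketch of the constraint, in plain CPython typing syntax (NAC3 does not currently support dicts in kernels): with one key type and one value type, indexing has a well-defined result type.

```python
from typing import Dict

# Uniform key type (str) and value type (float): indexing with any str
# key yields a float, so `x = d[y]` has unambiguous types.
d: Dict[str, float] = {'foo': 1.0, 'bar': 3.0}
x = d['foo']  # x is a float
print(x)
```

With mixed value types (say `int` and `float` per key), the type of `d[y]` would depend on the runtime value of `y`, which is exactly what a static typechecker cannot resolve.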
Contributor

But if you don't know beforehand what the attribute names are, how can you implement the `run` function? Is it generated dynamically?

Author
Collaborator

> They could be implemented, but all keys would need to have the same type and all values would need to have the same type. Otherwise even simple code such as `x = d[y]` could not typecheck (what types should be assigned to `x` and `y`?).

The types can be different, so a dict would not solve the problem in this specific scenario I have in mind. We currently do not have a need for dict support in kernels, but I know other users have mentioned they would like such a feature. Obviously it would be restricted to values of a single type.

> but if you don't know beforehand what the attribute names are, how can you implement the `run` function? is that generated dynamically?

This is how our DAX.scan infrastructure currently works, which is loosely based on the idea of `setattr_dataset()`. See the following example program:

https://gitlab.com/duke-artiq/dax-example/-/blob/master/repository/scan/detection_scan.py

  • Line 21: the key/identifier of the scan variable is defined when adding a scan dimension.
  • Line 58: internally, we take the product of all the scan dimensions. The values for each point in the scan are dynamically attached as attributes to the `point` argument of the `run_point()` function; kernels can access each value as an attribute of `point`. This allows the program to be compiled by ARTIQ.

So this is obviously a problem for typing the internal list of `Point` objects as well as the `point` argument to the `run_point()` function. Currently it fully relies on the type inferencer.
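For context, the dimension product described above can be sketched host-side roughly like this (hypothetical dimension names, not the actual DAX implementation):

```python
import itertools

# Two hypothetical scan dimensions; each scan point takes one value
# from every dimension, giving the Cartesian product.
dims = {'t': [1.0, 2.0], 'amp': [0.1, 0.2]}
points = [dict(zip(dims.keys(), values))
          for values in itertools.product(*dims.values())]
print(points)  # 4 points: {'t': 1.0, 'amp': 0.1}, {'t': 1.0, 'amp': 0.2}, ...
```

Each resulting dict is what DAX.scan exposes as attributes on a `point` object, which is precisely the dynamically-typed part.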

I guess one solution is to force the user to type this to make this "static" again. Maybe something like

```python
class MyPoint(dax.scan.Point):
    t: KernelInvariant[float]


class MyScanExperiment(DaxScan, EnvExperiment):
    DAX_SCAN_POINT_TYPE = MyPoint  # Pass the custom class to DAX.scan
    _dax_scan_points: KernelInvariant[list[MyPoint]]  # Type the internal scan point list

    ...

    def run_point(self, point: MyPoint):  # Type the point argument
        ...
```

Another direction could be supporting more dynamic typing by using `typing.cast()`, but I can imagine that is a can of worms we do not want to open.
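For reference, `typing.cast()` does no runtime work at all, which hints at the can of worms: the compiler would have to trust the annotation blindly.

```python
import typing

# typing.cast() only informs the static type checker; it performs no
# runtime conversion or check and returns its argument unchanged.
x: object = 1.0
y = typing.cast(float, x)
assert y is x  # same object, nothing was converted or verified
print(y)
```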

If you have any better ideas to solve this statically or dynamically, let me know.

Owner

> I guess one solution is to force the user to type this to make this "static" again.

Sounds fine and should be supported already, so I guess we can close this.

sb10q closed this issue 2022-04-08 11:05:48 +08:00
Contributor

This is a major problem for testing NAC3 here in practice, as we similarly rely on ndscan to centrally provide the experiment execution logic in a generic fashion (variable numbers and types of arguments and results). It turns out that ndscan actually uses methods generated via `kernel_from_string()` to fetch parameter values (scan points) from the host at runtime using a `_kscan_param_values_chunk()` RPC call for various reasons, but that runs into the same issue of needing to modify the annotations on these on the host at runtime.

I believe we do need to find a solution for this, as it is pretty clear that abstracting out the experiment execution logic is a useful and commonly desired thing to do, to the point where a number of different such libraries have already sprung up.

Owner

Is moving all the dynamic programming into the CPython library, as suggested by the `MyPoint` example, not a good solution?

There's always the option of static code generators i.e. something that generates actual Python files that you can pass to ARTIQ...
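A hedged sketch of that code-generator idea (the `GeneratedPoint` name and the field table are made up for illustration): turn the dynamic scan description into concrete Python source text that NAC3's parser can then see statically.

```python
# Hypothetical: the scan dimensions, known at generation time on the host.
fields = {'t': 'float', 'amp': 'float'}

# Emit a concrete, statically-annotated class definition as source text.
lines = [f"    {name}: KernelInvariant[{typ}]" for name, typ in fields.items()]
source = "class GeneratedPoint:\n" + "\n".join(lines) + "\n"
print(source)
```

The generated file would then be imported like any hand-written experiment, so no annotations need to be modified at runtime.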

Reference: M-Labs/nac3#258