doc/manual/faq: more minor details
commit 2d1bbdf22a
parent bd2f6ea9f9
@@ -4,7 +4,7 @@ FAQ
 How do I ...
 ============
 
-override the `sysclk` frequency of just one dds?
+override the `sysclk` frequency of just one DDS?
 ------------------------------------------------
 
 Override the parameter using an argument in the DDB.
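
The answer above ("Override the parameter using an argument in the DDB.") can be made concrete with a minimal sketch of a device database fragment. The device name, driver module/class and argument values below are illustrative assumptions, not taken from this commit:

    # Sketch of a DDB fragment: a per-device "sysclk" argument overrides the
    # frequency for this one DDS only. All names and values are hypothetical.
    ddb = {
        "dds_special": {
            "type": "local",
            "module": "artiq.coredevice.dds",   # assumed driver module
            "class": "AD9914",                  # assumed DDS class
            "arguments": {"sysclk": 3e9, "channel": 2},
        },
    }
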
@@ -18,15 +18,14 @@ Names need to be unique.
 enforce functional dependencies between parameters?
 ---------------------------------------------------
 
-If you want to override a parameter `b` in the PDB to be `b = 2*a`,
+If you want to override a parameter ``b`` in the PDB to be ``b = 2*a``,
 use wrapper experiments, overriding parameters by passing them to the
 experiment's constructor.
 
-get rid of `DBKeys`?
---------------------
+get rid of ``DBKeys``?
+----------------------
 
-`DBKeys` enforces valid parameter/argument names, references
-keys in PDB and hints at metadata on how values can be retrieved.
+``DBKeys`` references keys in PDB, DDB and RDB.
 
 write a generator feeding a kernel feeding an analyze function?
 ---------------------------------------------------------------
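
To illustrate the wrapper-experiment answer in this hunk (``b = 2*a`` enforced by passing parameters to a sub-experiment's constructor), here is a minimal plain-Python sketch; the class names and constructor signature are assumptions, not ARTIQ API:

    # The wrapper computes the dependent parameter in one place and hands it
    # to the sub-experiment's constructor. Names are illustrative only.
    class Child:
        def __init__(self, a, b):
            self.a = a
            self.b = b

        def run(self):
            print("running with", self.a, self.b)


    class Wrapper:
        def __init__(self, a):
            # enforce the functional dependency b = 2*a here
            self.child = Child(a=a, b=2 * a)

        def run(self):
            self.child.run()


    if __name__ == "__main__":
        Wrapper(a=3).run()   # child sees a=3, b=6
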
@@ -49,8 +48,8 @@ write a generator feeding a kernel feeding an analyze function?
     def do(self, i):
         return i
 
-create and use variable lengths arrays?
-------------------------------------------------
+create and use variable lengths arrays in kernels?
+--------------------------------------------------
 
 Don't. Preallocate everything. Or chunk it and e.g. read 100 events per
 function call, push them upstream and retry until the gate time closes.
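
A plain-Python sketch of the "preallocate or chunk" advice above, assuming hypothetical ``read_events``/``push_upstream`` helpers and a 100-event chunk; this is not kernel code:

    # Chunked readout sketch: one fixed-size buffer is allocated up front and
    # reused; each pass handles at most CHUNK events until the gate closes.
    CHUNK = 100

    def read_events(buf):
        # placeholder: fill buf and return the number of valid events
        return 0

    def push_upstream(buf, n):
        # placeholder: hand the first n events to the host
        pass

    def acquire(gate_open):
        buf = [0] * CHUNK          # preallocated, never resized
        while gate_open():
            n = read_events(buf)
            push_upstream(buf, n)

    # Example: a gate that closes immediately, so acquire() returns at once.
    acquire(lambda: False)
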
@@ -58,15 +57,15 @@ function call, push them upstream and retry until the gate time closes.
 execute multiple slow controller RPCs in parallel without losing time?
 ----------------------------------------------------------------------
 
-Use `threading.Thread`: portable, fast, simple for one-shot calls.
+Use ``threading.Thread``: portable, fast, simple for one-shot calls.
 
 write part of my experiment as a coroutine/asyncio task/generator?
 ------------------------------------------------------------------
 
-You can not change the API that your experiment exposes: `__init__()`,
-`build()`, `run()` and `analyze()` need to be regular functions, not
+You can not change the API that your experiment exposes: ``__init__()``,
+``build()``, ``run()`` and ``analyze()`` need to be regular functions, not
 generators or asyncio coroutines. That would make reusing your own code in
 sub-experiments difficult and fragile. You can however always use the
-scheduler API to achieve the same (`scheduler.yield(duration=0)`)
+scheduler API to achieve the same (``scheduler.yield(duration=0)``)
 or wrap your own generators/coroutines/tasks in regular functions that
 you then expose as part of the API.
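
For the first answer in this hunk, a sketch of running two slow one-shot RPCs in parallel with ``threading.Thread``; the ``FakeController`` class stands in for real controller clients and is purely illustrative:

    import threading

    # Hypothetical stand-ins for two slow controller RPC clients.
    class FakeController:
        def __init__(self, name):
            self.name = name

        def measure(self):
            # imagine a slow network RPC here
            return 42

    ctl_a, ctl_b = FakeController("a"), FakeController("b")
    results = {}

    def call(name, ctl):
        results[name] = ctl.measure()

    # Run both one-shot RPCs in parallel threads and wait for both.
    threads = [threading.Thread(target=call, args=(n, c))
               for n, c in (("a", ctl_a), ("b", ctl_b))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(results)
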
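And for the second answer, a sketch of keeping a generator as an internal detail while the experiment still exposes a regular ``run()``; the class and method names are illustrative, not the ARTIQ experiment API:

    # The generator stays private; the public run() method remains a regular
    # function, as the experiment API requires.
    class MyExperiment:
        def _points(self):
            # generator used internally
            for i in range(5):
                yield i * 0.1

        def run(self):
            for x in self._points():
                print("step", x)

    MyExperiment().run()
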