Note that because we changed the exception representation from string
names as exception identifiers to integer IDs, we need to initialize
the embedding map in order to allocate those IDs. Also, we can no
longer print exception names and messages from the kernel; the host
has to map exception IDs back to names, and may also need to map
string IDs to the actual strings (messages can be static strings in
the firmware, or strings stored only on the host).
The lit tests now check for exception IDs, which are fixed because all
builtin exceptions are preallocated.
The exception name is replaced by an exception ID, which requires no
allocation. Other strings in the exception can now be 'host-only'
strings, represented by a CSlice with len = usize::MAX and ptr = key,
avoiding the need for allocation when raising exceptions through RPC.
We need to check whether type inference has reached a fixed point.
This is done by hashing the types in the AST, which is very slow. This
patch avoids computing the hash when we can be sure the AST has
definitely changed, namely when a new function has just been parsed.
For some simple programs with many functions, this can reduce the
compile time by up to ~30%.
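A minimal, illustrative sketch of the idea; the helper names and the
flattened AST representation are made up, not the actual nac3 code:

    def hash_types(functions):
        # functions: name -> printable form of the inferred types; a stand-in
        # for the real AST walk, which is the expensive part.
        return hash(tuple(sorted(functions.items())))

    def check_fixed_point(functions, previous_hash, parsed_new_function):
        if parsed_new_function:
            # A new function was just parsed, so the AST definitely changed;
            # skip the expensive hash and report "not a fixed point yet".
            return False, None
        current = hash_types(functions)
        return current == previous_hash, current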
According to PEP 484, a type hint can be a string literal for forward
references. With PEP 563, all type hints are preserved in annotations
in string form.
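For illustration, both forms the compiler now has to handle (plain
Python, not taken from the test suite):

    from __future__ import annotations   # PEP 563: all hints are stored as strings

    from numpy import int32

    def scale(x: int32, factor: "int32") -> int32:
        # Both annotations reach the compiler as the string "int32" (the
        # second one is a PEP 484 forward-reference literal even without
        # PEP 563), so it has to evaluate them to obtain the actual types.
        return x * factor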
Since we don't implement any integer-like operations for TBool
(addition, bitwise not, etc.), TBool is currently neither strictly
equivalent to the builtin bool nor to numpy.bool_, but this shows up
as very obvious compiler errors (operation not supported) rather than
silently different runtime behaviour.
Just mapping both to TBool is thus a huge improvement over the current
behaviour (where numpy.False_ is a true-like object). In the future,
we could still implement more operations for TBool, presumably
following numpy.bool_ rather than the builtin type, just like builtin
integers are translated to the numpy-like TInt{32,64}.
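For illustration, a hedged sketch of the situation this addresses (the
experiment skeleton and attribute name are made up):

    import numpy as np
    from artiq.experiment import EnvExperiment, kernel

    class FlagDemo(EnvExperiment):
        def build(self):
            self.setattr_device("core")
            self.flag = np.bool_(False)   # host-side numpy bool attribute

        @kernel
        def run(self):
            # self.flag is now typed as TBool, just like a builtin bool;
            # previously numpy.False_ behaved as a true-like object here.
            if self.flag:
                pass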
GitHub: Fixes #1275.
Lists and arrays no longer share the same representation all the way
through codegen, as they used to.
This could/should be made more efficient later by eliding the
temporary copies.
Relies on the runtime to provide the necessary
(libm-compatible) functions.
The test is nifty, but a bit brittle; if it breaks in the future
because of optimizer changes, do not hesitate to convert it into a
more pedestrian test case.
This was mistakenly included in fb2b634c4a, and broke the test
case verifying that using None as an ARTIQ type annotation in fact
generates an error message.
With support for polymorphism (or type erasure on pointers to member
functions) absent in the ARTIQ compiler, code generation is vital for
implementing abstractions that work with user-provided lists/trees of
objects sharing a uniform interface (e.g. a common base class, or duck
typing) but having different concrete types.
@kernel_from_string has been in production use for exactly this use
case in Oxford for the better part of a year now (various places in
ndscan).
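As an illustration of the pattern, a hedged sketch (the fragment
objects and their device_setup() method are hypothetical; only
kernel_from_string itself is the API added here):

    from artiq.language.core import kernel, kernel_from_string

    def build_setup_kernel(num_fragments):
        # Generate straight-line code that calls device_setup() on each of a
        # fixed set of objects; every parameter keeps its own concrete type,
        # so no polymorphic dispatch is needed in the compiled kernel.
        names = ["frag{}".format(i) for i in range(num_fragments)]
        body = "\n".join("{}.device_setup()".format(n) for n in names)
        return kernel_from_string(["self"] + names, body, kernel)

    # Illustrative usage: setup_all = build_setup_kernel(len(frags))
    #                     setup_all(self, *frags)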
GitHub: Fixes #1089.
async is now a full (non-contextual) keyword.
There are two more instances:
- artiq/frontend/artiq_client.py
- artiq/devices/thorlabs_tcube/driver.py
It is not immediately clear how to fix those, so they are left for
later work.
Before this commit, we had a hack where inferencing would run one more
time, in case we could still infer more attributes even after all the
type information was there. This doesn't work if *that* round of
inferencing brings in even more code; for example, this is easy to
trigger with context managers.
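As a hedged illustration of the kind of code that exposes this (using
the standard core_dma API; whether this exact construct was the
original trigger is an assumption):

    from artiq.experiment import EnvExperiment, kernel

    class DMAExample(EnvExperiment):
        def build(self):
            self.setattr_device("core")
            self.setattr_device("core_dma")

        @kernel
        def record(self):
            # Typing this `with` statement pulls in the context manager's
            # __enter__/__exit__ methods, i.e. new code (and new attributes
            # to infer) appearing after the "final" inference round.
            with self.core_dma.record("pulses"):
                pass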
The core of the problem that 373578bc was attempting to solve is that
diagnostics sometimes need to be chained; one way of chaining is the
loc.expanded_from feature, which handles macro-like expansion, but
another is providing context.
Before this commit, context was provided through an ad-hoc override of
the diagnostic engine, which did not work in cases where the
diagnostic engine was not threaded through the call stack. This commit
uses the newly added pythonparser context feature to handle the
problem elegantly.