scanspec.core#
Core classes like Frames and Path.

Members:
- Axis – A type variable for an Axis that can be specified for a scan
- OtherAxis – Alternative axis variable to be used when two are required in the same type binding
- if_instance_do – If x is of type cls then return func(x), otherwise return NotImplemented
- AxesPoints – Map of axes to float ndarray of points, e.g. {xmotor: array([0, 1, 2]), ymotor: array([2, 2, 2])}
- Frames – Represents a series of scan frames along a number of axes
- SnakedFrames – Like a Frames object, but each alternate repetition will run in reverse
- gap_between_frames – Is there a gap between the end of frames1 and the start of frames2
- squash_frames – Squash a stack of nested Frames into a single one
- Path – A consumable route through a stack of Frames, representing a scan path
- Midpoints – Convenience iterable that produces the scan midpoints for each axis
- discriminated_union_of_subclasses – Add all subclasses of super_cls to a discriminated union
- StrictConfig – Used to ensure pydantic dataclasses error if given extra arguments
- class scanspec.core.Axis#
A type variable for an Axis that can be specified for a scan
alias of TypeVar('Axis', covariant=True)
- class scanspec.core.OtherAxis#
Alternative axis variable to be used when two are required in the same type binding
alias of TypeVar('OtherAxis')
- scanspec.core.if_instance_do(x: C, cls: type[C], func: Callable[[C], T]) → T [source]#
If x is of type cls then return func(x), otherwise return NotImplemented.
Used as a helper when implementing operator overloading.
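As an illustration of the operator-overloading pattern this helper supports, here is a minimal self-contained sketch (the `Seconds` class is hypothetical, not part of scanspec): returning `NotImplemented` lets Python try the other operand's reflected operator instead of raising immediately.

```python
from collections.abc import Callable
from typing import TypeVar

C = TypeVar("C")
T = TypeVar("T")


def if_instance_do(x: C, cls: type[C], func: Callable[[C], T]) -> T:
    # Minimal sketch of the helper pattern (not the library source):
    # call func(x) when x has the expected type, otherwise return
    # NotImplemented so Python can fall back to the reflected operator.
    if isinstance(x, cls):
        return func(x)
    return NotImplemented


class Seconds:
    """Hypothetical class used only to demonstrate the pattern."""

    def __init__(self, value: float):
        self.value = value

    def __add__(self, other: object) -> "Seconds":
        # Only Seconds + Seconds is defined; anything else defers politely
        return if_instance_do(other, Seconds, lambda o: Seconds(self.value + o.value))
```

With this pattern, `Seconds(1) + Seconds(2)` yields a `Seconds(3)`, while `Seconds(1) + "x"` raises the usual `TypeError` once Python has exhausted both operands' operator implementations.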
- scanspec.core.AxesPoints#
Map of axes to float ndarray of points, e.g. {xmotor: array([0, 1, 2]), ymotor: array([2, 2, 2])}
- class scanspec.core.Frames(midpoints: dict[Axis, ndarray[Any, dtype[floating[Any]]]], lower: dict[Axis, ndarray[Any, dtype[floating[Any]]]] | None = None, upper: dict[Axis, ndarray[Any, dtype[floating[Any]]]] | None = None, gap: ndarray[Any, dtype[bool]] | None = None)[source]#
Represents a series of scan frames along a number of axes.
During a scan each axis will traverse lower-midpoint-upper for each frame.
- Parameters:
midpoints – The midpoints of scan frames for each axis
lower – Lower bounds of scan frames if different from midpoints
upper – Upper bounds of scan frames if different from midpoints
gap – If supplied, defines whether there is a gap between each frame and the previous one; otherwise it is calculated from the lower and upper bounds
Typically used in two ways:
1. A list of Frames objects returned from Spec.calculate represents a scan as a linear stack of frames. Interpreted as nested from slowest moving to fastest moving, so each faster Frames object will iterate once per position of the slower Frames object. It is passed to a Path for calculation of the actual scan path.
2. A single Frames object returned from Path.consume represents a chunk of frames forming part of a scan path, for interpretation by the code that will actually perform the scan.
See also
- midpoints#
The midpoints of scan frames for each axis
- lower#
The lower bounds of each scan frame in each axis for fly-scanning
- upper#
The upper bounds of each scan frame in each axis for fly-scanning
- gap#
Whether there is a gap between this frame and the previous. First element is whether there is a gap between the last frame and the first
- extract(indices: ndarray[Any, dtype[signedinteger[Any]]], calculate_gap: bool = True) → Frames[Axis] [source]#
Return a new Frames object restricted to the indices provided.
- Parameters:
indices – The indices of the frames to extract, modulo scan length
calculate_gap – If True then recalculate the gap from upper and lower
>>> frames = Frames({"x": np.array([1, 2, 3])})
>>> frames.extract(np.array([1, 0, 1])).midpoints
{'x': array([2, 1, 2])}
- concat(other: Frames[Axis], gap: bool = False) → Frames[Axis] [source]#
Return a new Frames object concatenating self and other.
Requires both Frames objects to have the same axes, but not necessarily in the same order. The order is inherited from self, so other may be reordered.
- Parameters:
other – The Frames to concatenate to self
gap – Whether to force a gap between the two Frames objects
>>> frames = Frames({"x": np.array([1, 2, 3]), "y": np.array([6, 5, 4])})
>>> frames2 = Frames({"y": np.array([3, 2, 1]), "x": np.array([4, 5, 6])})
>>> frames.concat(frames2).midpoints
{'x': array([1, 2, 3, 4, 5, 6]), 'y': array([6, 5, 4, 3, 2, 1])}
- zip(other: Frames[Axis]) → Frames[Axis] [source]#
Return a new Frames object merging self and other.
Requires both Frames objects to not share any axes.
>>> fx = Frames({"x": np.array([1, 2, 3])})
>>> fy = Frames({"y": np.array([5, 6, 7])})
>>> fx.zip(fy).midpoints
{'x': array([1, 2, 3]), 'y': array([5, 6, 7])}
- class scanspec.core.SnakedFrames(midpoints: dict[Axis, ndarray[Any, dtype[floating[Any]]]], lower: dict[Axis, ndarray[Any, dtype[floating[Any]]]] | None = None, upper: dict[Axis, ndarray[Any, dtype[floating[Any]]]] | None = None, gap: ndarray[Any, dtype[bool]] | None = None)[source]#
Like a Frames object, but each alternate repetition will run in reverse.
- classmethod from_frames(frames: Frames[OtherAxis]) → SnakedFrames[OtherAxis] [source]#
Create a snaked version of a Frames object.
- extract(indices: ndarray[Any, dtype[signedinteger[Any]]], calculate_gap: bool = True) → Frames[Axis] [source]#
Return a new Frames object restricted to the indices provided.
- Parameters:
indices – The indices of the frames to extract, can extend past len(self)
calculate_gap – If True then recalculate the gap from upper and lower
>>> frames = SnakedFrames({"x": np.array([1, 2, 3])})
>>> frames.extract(np.array([0, 1, 2, 3, 4, 5])).midpoints
{'x': array([1, 2, 3, 3, 2, 1])}
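The alternating reversal shown in the doctest above can be reproduced with plain NumPy index arithmetic. This is an illustrative sketch of the arithmetic, not the library's actual implementation: each requested index is mapped into a forward-plus-reverse cycle of length 2*n, and the second half of each cycle walks the points backwards.

```python
import numpy as np

points = np.array([1, 2, 3])
n = len(points)

indices = np.arange(6)                  # indices may extend past len(points)
cycle = indices % (2 * n)               # position within a forward+reverse cycle
snaked = np.where(cycle < n, cycle, 2 * n - 1 - cycle)  # reverse the second half
result = points[snaked]                 # array([1, 2, 3, 3, 2, 1])
```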
- scanspec.core.gap_between_frames(frames1: Frames[Axis], frames2: Frames[Axis]) → bool [source]#
Is there a gap between the end of frames1 and the start of frames2.
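A sketch of what such a check could look like, under the assumption (not confirmed by the source) that a gap means the upper bound at the end of the first chunk does not meet the lower bound at the start of the second on some axis. The `has_gap` helper is hypothetical, not the library function:

```python
import numpy as np


def has_gap(last_upper: dict[str, float], first_lower: dict[str, float]) -> bool:
    # Hypothetical sketch (assumed semantics, not the library source):
    # there is a gap if, on any axis, where the first chunk's motion ends
    # is not where the second chunk's motion begins.
    return any(
        not np.isclose(last_upper[axis], first_lower[axis])
        for axis in last_upper
    )


has_gap({"x": 1.5}, {"x": 1.5})  # False: bounds meet, motion is continuous
has_gap({"x": 1.5}, {"x": 3.0})  # True: the motor must jump between chunks
```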
- scanspec.core.squash_frames(stack: list[Frames[Axis]], check_path_changes: bool = True) → Frames[Axis] [source]#
Squash a stack of nested Frames into a single one.
- Parameters:
stack – The Frames stack to squash, from slowest to fastest moving
check_path_changes – If True then check that nesting the output Frames object within others will provide the same path as nesting the input Frames stack within others
>>> fx = SnakedFrames({"x": np.array([1, 2])})
>>> fy = Frames({"y": np.array([3, 4])})
>>> squash_frames([fy, fx]).midpoints
{'y': array([3, 3, 4, 4]), 'x': array([1, 2, 2, 1])}
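The midpoints in the doctest above follow from simple repeat/tile arithmetic. This sketch (an illustration only, not the library source) shows how a slow axis and a snaked fast axis combine into one flat stack:

```python
import numpy as np

y = np.array([3, 4])  # slow axis
x = np.array([1, 2])  # fast, snaked axis

# Slow axis: each point is held while the fast axis runs a full pass
y_squashed = np.repeat(y, len(x))  # [3, 3, 4, 4]

# Fast axis: snaking reverses the direction on every other pass
x_squashed = np.concatenate(
    [x if i % 2 == 0 else x[::-1] for i in range(len(y))]
)  # [1, 2, 2, 1]
```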
- class scanspec.core.Path(stack: list[Frames[Axis]], start: int = 0, num: int | None = None)[source]#
A consumable route through a stack of Frames, representing a scan path.
- Parameters:
stack – The Frames stack describing the scan, from slowest to fastest moving
start – The index of where in the Path to start
num – The number of scan frames to produce after start. None means up to the end
See also
- stack#
The Frames stack describing the scan, from slowest to fastest moving
- index#
Index that is next to be consumed
- lengths#
The lengths of each Frames object in the stack
- end_index#
Index of the end frame, one more than the last index that will be produced
- consume(num: int | None = None) → Frames[Axis] [source]#
Consume at most num frames from the Path and return as a Frames object.
>>> fx = SnakedFrames({"x": np.array([1, 2])})
>>> fy = Frames({"y": np.array([3, 4])})
>>> path = Path([fy, fx])
>>> path.consume(3).midpoints
{'y': array([3, 3, 4]), 'x': array([1, 2, 2])}
>>> path.consume(3).midpoints
{'y': array([4]), 'x': array([1])}
>>> path.consume(3).midpoints
{'y': array([], dtype=int64), 'x': array([], dtype=int64)}
- class scanspec.core.Midpoints(stack: list[Frames[Axis]])[source]#
Convenience iterable that produces the scan midpoints for each axis.
For better performance, consume from a Path instead.
- Parameters:
stack – The stack of Frames describing the scan, from slowest to fastest moving
See also
>>> fx = SnakedFrames({"x": np.array([1, 2])})
>>> fy = Frames({"y": np.array([3, 4])})
>>> mp = Midpoints([fy, fx])
>>> for p in mp: print(p)
{'y': np.int64(3), 'x': np.int64(1)}
{'y': np.int64(3), 'x': np.int64(2)}
{'y': np.int64(4), 'x': np.int64(2)}
{'y': np.int64(4), 'x': np.int64(1)}
- stack#
The stack of Frames describing the scan, from slowest to fastest moving
- scanspec.core.discriminated_union_of_subclasses(super_cls: type[C], discriminator: str = 'type') → type[C] [source]#
Add all subclasses of super_cls to a discriminated union.
For all subclasses of super_cls, add a discriminator field to identify the type. Raw JSON should look like {<discriminator>: <type name>, params for <type name>…}.
Subclasses that extend this class must be Pydantic dataclasses, and types that need their schema to be updated when a new type that extends super_cls is created must be either Pydantic dataclasses or BaseModels.
Example:
@discriminated_union_of_subclasses
class Expression(ABC):
    @abstractmethod
    def calculate(self) -> int: ...


@dataclass
class Add(Expression):
    left: Expression
    right: Expression

    def calculate(self) -> int:
        return self.left.calculate() + self.right.calculate()


@dataclass
class Subtract(Expression):
    left: Expression
    right: Expression

    def calculate(self) -> int:
        return self.left.calculate() - self.right.calculate()


@dataclass
class IntLiteral(Expression):
    value: int

    def calculate(self) -> int:
        return self.value


my_sum = Add(IntLiteral(5), Subtract(IntLiteral(10), IntLiteral(2)))
assert my_sum.calculate() == 13
assert my_sum == parse_obj_as(
    Expression,
    {
        "type": "Add",
        "left": {"type": "IntLiteral", "value": 5},
        "right": {
            "type": "Subtract",
            "left": {"type": "IntLiteral", "value": 10},
            "right": {"type": "IntLiteral", "value": 2},
        },
    },
)
- Parameters:
super_cls – The superclass of the union, Expression in the above example
discriminator – The discriminator that will be inserted into the serialized documents for type determination. Defaults to 'type'.
- Returns:
Decorated superclass with handling for subclasses to be added to its discriminated union for deserialization
- Return type:
Type
- scanspec.core.StrictConfig: ConfigDict = {'extra': 'forbid'}#
Used to ensure pydantic dataclasses error if given extra arguments
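A sketch of how such a config might be applied, assuming Pydantic v2 (the `Point` class and the `StrictConfig` assignment here are illustrative, mirroring the value shown above): with extra="forbid", validating a payload that carries unknown keys fails instead of silently dropping them.

```python
from pydantic import ConfigDict, TypeAdapter
from pydantic.dataclasses import dataclass

# Illustrative mirror of the config value shown above
StrictConfig: ConfigDict = {"extra": "forbid"}


@dataclass(config=StrictConfig)
class Point:
    """Hypothetical dataclass used only to demonstrate the config."""

    x: float
    y: float


adapter = TypeAdapter(Point)
adapter.validate_python({"x": 1.0, "y": 2.0})  # ok
# adapter.validate_python({"x": 1.0, "y": 2.0, "z": 3.0})
# would raise a ValidationError because of the unexpected "z" key
```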