Class Batch<T>
Represents a queue of elements that are processed in batches, either automatically or on demand.
public abstract class Batch<T> : IBatch<T>, IDisposable, IAsyncDisposable
Type Parameters
T
Element type.
- Inheritance
object → Batch<T>
- Implements
IBatch<T>, IDisposable, IAsyncDisposable
Constructors
Batch(BatchQueueOverflowStrategy, int, int, int)
Creates a new Batch<T> instance.
protected Batch(BatchQueueOverflowStrategy queueOverflowStrategy = BatchQueueOverflowStrategy.DiscardLast, int autoFlushCount = 1000, int queueSizeLimitHint = 100000000, int minInitialCapacity = 0)
Parameters
queueOverflowStrategy
BatchQueueOverflowStrategy
Specifies the strategy to use when handling overflow of enqueued elements. Equal to DiscardLast by default.
autoFlushCount
int
Specifies the number of enqueued elements which acts as a threshold that, when reached while adding new elements, will cause this batch to automatically Flush() itself. Equal to 1000 by default.
queueSizeLimitHint
int
Specifies the maximum number of enqueued elements that, when exceeded while adding new elements, will cause this batch to react according to its QueueOverflowStrategy. Equal to 100 000 000 by default.
minInitialCapacity
int
Specifies the minimum initial capacity of the internal queue. Equal to 0 by default.
Exceptions
- ArgumentOutOfRangeException
When autoFlushCount is less than 1.
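As a sketch of how the constructor parameters fit together, a minimal subclass might look like the following. The `LogBatch` type and its behavior are hypothetical; the example assumes the `Batch<T>` and `BatchQueueOverflowStrategy` types from this library are in scope.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

// Hypothetical subclass that batches log lines and writes them to a TextWriter.
public sealed class LogBatch : Batch<string>
{
    private readonly TextWriter _writer;

    public LogBatch(TextWriter writer)
        : base(
            queueOverflowStrategy: BatchQueueOverflowStrategy.DiscardLast,
            autoFlushCount: 100,        // flush automatically every 100 lines
            queueSizeLimitHint: 10_000) // react to overflow above ~10 000 queued lines
    {
        _writer = writer;
    }

    protected override async ValueTask<int> ProcessAsync(ReadOnlyMemory<string> items, bool disposing)
    {
        // Index into the memory per iteration so no Span<T> lives across an await.
        for (var i = 0; i < items.Length; i++)
            await _writer.WriteLineAsync(items.Span[i]);

        // Report all provided elements as successfully processed.
        return items.Length;
    }
}
```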
Properties
AutoFlushCount
Specifies the number of enqueued elements, which acts as a threshold that, when reached while adding new elements, will cause this batch to automatically Flush() itself.
public int AutoFlushCount { get; }
Property Value
int
Count
Specifies the number of elements currently waiting for processing in this batch.
public int Count { get; }
Property Value
int
QueueOverflowStrategy
Specifies the strategy to use when handling overflow of enqueued elements.
public BatchQueueOverflowStrategy QueueOverflowStrategy { get; }
Property Value
BatchQueueOverflowStrategy
Remarks
See QueueSizeLimitHint and BatchQueueOverflowStrategy for more details.
QueueSizeLimitHint
Specifies the maximum number of enqueued elements that, when exceeded while adding new elements, will cause this batch to react according to its QueueOverflowStrategy.
public int QueueSizeLimitHint { get; }
Property Value
int
Remarks
See BatchQueueOverflowStrategy for more details.
Methods
Add(T)
Adds the provided element to this batch's queue.
public bool Add(T item)
Parameters
item
T
Element to enqueue.
Returns
- bool
true when the element was enqueued successfully, otherwise false.
AddRange(IEnumerable<T>)
Adds the provided range of elements to this batch's queue.
public bool AddRange(IEnumerable<T> items)
Parameters
items
IEnumerable<T>
Range of elements to enqueue.
Returns
- bool
true when the elements were enqueued successfully, otherwise false.
AddRange(ReadOnlySpan<T>)
Adds the provided range of elements to this batch's queue.
public bool AddRange(ReadOnlySpan<T> items)
Parameters
items
ReadOnlySpan<T>
Range of elements to enqueue.
Returns
- bool
true when the elements were enqueued successfully, otherwise false.
Dispose()
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
public void Dispose()
DisposeAsync()
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources asynchronously.
public ValueTask DisposeAsync()
Returns
- ValueTask
A task that represents the asynchronous dispose operation.
Flush()
Signals this batch to dequeue all of its elements, in an attempt to process them.
public bool Flush()
Returns
- bool
true when the operation was initiated successfully, otherwise false.
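Typical producer-side usage, assuming `batch` is an instance of some concrete `Batch<string>` subclass (the variable name and the element type are illustrative):

```csharp
// 'batch' is assumed to be an instance of a concrete Batch<string> subclass.
batch.Add("first");                          // single element
batch.AddRange(new[] { "second", "third" }); // IEnumerable<T> overload

// Ask the batch to process everything enqueued so far, without waiting
// for AutoFlushCount to be reached.
var initiated = batch.Flush();
```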
OnDequeued(ReadOnlyMemory<T>, bool)
Allows derived classes to react to a non-empty range of elements being dequeued and made ready for processing.
protected virtual void OnDequeued(ReadOnlyMemory<T> items, bool disposing)
Parameters
items
ReadOnlyMemory<T>
Range of dequeued elements.
disposing
bool
Specifies whether or not this batch is in the process of being disposed.
Remarks
Exceptions thrown by this method will be completely ignored. Disposing the batch from inside this method may cause a deadlock.
OnDiscarding(QueueSlimMemory<T>, bool)
Allows derived classes to react to a non-empty range of elements being discarded, either due to QueueOverflowStrategy or due to this batch being disposed.
protected virtual void OnDiscarding(QueueSlimMemory<T> items, bool disposing)
Parameters
items
QueueSlimMemory<T>
Range of elements to be discarded.
disposing
bool
Specifies whether or not this batch is in the process of being disposed.
Remarks
Exceptions thrown by this method will be completely ignored.
OnDisposed()
Allows derived classes to react to the batch being disposed.
protected virtual void OnDisposed()
Remarks
Exceptions thrown by this method will be completely ignored. The batch will not have any enqueued elements at the moment of invocation of this method.
OnEnqueued(QueueSlimMemory<T>, bool)
Allows derived classes to react to a non-empty range of elements being enqueued.
protected virtual void OnEnqueued(QueueSlimMemory<T> items, bool autoFlushing)
Parameters
items
QueueSlimMemory<T>
Range of enqueued elements.
autoFlushing
bool
Specifies whether or not this batch will be automatically flushed due to the combination of its AutoFlushCount and new elements being added.
Remarks
Exceptions thrown by this method will be completely ignored.
ProcessAsync(ReadOnlyMemory<T>, bool)
Asynchronously processes the provided non-empty range of dequeued elements.
protected abstract ValueTask<int> ProcessAsync(ReadOnlyMemory<T> items, bool disposing)
Parameters
items
ReadOnlyMemory<T>
Range of elements to process. Number of elements will not exceed AutoFlushCount.
disposing
bool
Specifies whether or not this batch is in the process of being disposed.
Returns
- ValueTask<int>
A task that returns the number of successfully processed elements.
Remarks
Exceptions thrown by this method will be completely ignored. If the returned number of successfully processed elements is less than 1, the batch will stop attempting to process enqueued elements until the next Flush() invocation or an automatic flush triggered by AutoFlushCount. If the returned number is greater than 0 but less than the number of provided elements, the batch will treat the elements at the start of the provided range as processed, and the remaining elements will stay in the buffer. Disposing the batch from inside this method may cause a deadlock.
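The partial-processing contract described above might be implemented along these lines. This is a hedged sketch: `TrySendAsync` is a hypothetical helper, assumed to return false when an element could not be delivered.

```csharp
protected override async ValueTask<int> ProcessAsync(ReadOnlyMemory<string> items, bool disposing)
{
    var processed = 0;
    for (var i = 0; i < items.Length; i++)
    {
        // TrySendAsync is a hypothetical helper; a failed send stops processing.
        if (!await TrySendAsync(items.Span[i]))
            break;

        processed++;
    }

    // Returning fewer than items.Length keeps the unprocessed tail in the buffer;
    // returning 0 pauses processing until the next flush.
    return processed;
}
```

Returning the count this way means only the undelivered tail is retried on the next flush, since the batch treats the leading elements as processed.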