pub struct Predictor<const N_SHARDS: usize> { /* private fields */ }
Cacheability Predictor
Remembers previously uncacheable assets. Allows bypassing cache / cache lock early based on historical precedent.
NOTE: to simply avoid caching requests with certain characteristics, add checks in request_cache_filter to avoid enabling cache in the first place. The predictor’s bypass mechanism handles cases where the request looks cacheable but its previous responses suggest otherwise. The request could be cacheable in the future.
Implementations
impl<const N_SHARDS: usize> Predictor<N_SHARDS>
pub fn new(
    shard_capacity: usize,
    skip_custom_reasons_fn: Option<CustomReasonPredicate>,
) -> Predictor<N_SHARDS>
Create a new Predictor with N_SHARDS * shard_capacity total capacity for uncacheable cache keys.

- shard_capacity: defines the number of keys remembered as uncacheable per LRU shard.
- skip_custom_reasons_fn: an optional predicate used in mark_uncacheable that can customize which Custom NoCacheReasons ought to be remembered as uncacheable. If the predicate returns true, the predictor will skip remembering the current cache key as uncacheable (and avoid bypassing cache on the next request).
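A minimal construction sketch for illustration (the pingora_cache::predictor import path is assumed, and the shard count and per-shard capacity are arbitrary example values):

use pingora_cache::predictor::Predictor;

fn main() {
    // 16 LRU shards, each remembering up to 1_000 uncacheable cache keys,
    // for a total capacity of 16 * 1_000 keys (illustration values only).
    // Passing None for skip_custom_reasons_fn means no predicate filters
    // which Custom NoCacheReasons get remembered.
    let predictor: Predictor<16> = Predictor::new(1_000, None);
    let _ = predictor;
}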
Trait Implementations
impl<const N_SHARDS: usize> CacheablePredictor for Predictor<N_SHARDS>
fn cacheable_prediction(&self, key: &CacheKey) -> bool
Returns true if the key is likely cacheable, false if likely not.
fn mark_cacheable(&self, key: &CacheKey) -> bool
Mark the key as cacheable to allow the next request to cache.
Returns false if the key was already marked cacheable.
fn mark_uncacheable(
    &self,
    key: &CacheKey,
    reason: NoCacheReason,
) -> Option<bool>
Mark the key as uncacheable to actively bypass the cache on the next request.
Marking may be skipped for certain NoCacheReasons.
Returns None if we skipped marking uncacheable.
Returns Some(false) if the key was already marked uncacheable.
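A usage sketch of the trait lifecycle, not a definitive integration: the import paths, the CacheKey::new(namespace, primary, user_tag) constructor, and the NoCacheReason::OriginNotCache variant are assumptions about the surrounding pingora_cache crate, and the key values are for illustration only.

use pingora_cache::key::CacheKey;
use pingora_cache::predictor::{CacheablePredictor, Predictor};
use pingora_cache::NoCacheReason;

fn main() {
    let predictor: Predictor<16> = Predictor::new(1_000, None);

    // Assumed constructor: CacheKey::new(namespace, primary, user_tag).
    let key = CacheKey::new("", "https://example.com/asset.js", "");

    // With no prior history, the key is predicted cacheable.
    let before = predictor.cacheable_prediction(&key);

    // The origin response turned out to be uncacheable; remember that so the
    // next request can bypass the cache (and the cache lock) early.
    // NoCacheReason::OriginNotCache is only an example reason here.
    let marked = predictor.mark_uncacheable(&key, NoCacheReason::OriginNotCache);
    // None        => marking was skipped for this reason
    // Some(true)  => newly marked uncacheable
    // Some(false) => already marked uncacheable
    println!("predicted cacheable before: {before}, mark result: {marked:?}");

    // A later response was cacheable again: clear the mark so the next
    // request is allowed to cache.
    predictor.mark_cacheable(&key);
    println!("predicted cacheable now: {}", predictor.cacheable_prediction(&key));
}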
Auto Trait Implementations
impl<const N_SHARDS: usize> !Freeze for Predictor<N_SHARDS>
impl<const N_SHARDS: usize> !RefUnwindSafe for Predictor<N_SHARDS>
impl<const N_SHARDS: usize> Send for Predictor<N_SHARDS>
impl<const N_SHARDS: usize> Sync for Predictor<N_SHARDS>
impl<const N_SHARDS: usize> Unpin for Predictor<N_SHARDS>
impl<const N_SHARDS: usize> UnwindSafe for Predictor<N_SHARDS>
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.