#[repr(C)]
pub struct MemBump { /* private fields */ }

A dynamically sized allocation block in which any type can be allocated.

Implementations§

impl MemBump

pub fn new(capacity: usize) -> Box<Self>

Allocate some space to use for a bump allocator.
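
As a minimal sketch (the 128-byte capacity is an arbitrary choice), construction and a first allocation look like this:
use core::alloc::Layout;
use static_alloc::unsync::MemBump;

// Reserve space on the heap for bump allocation; the capacity is arbitrary.
let bump = MemBump::new(128);
assert!(bump.capacity() > 0);

// Carve out room for a single u32 through the safe allocation interface.
let _ptr = bump.alloc(Layout::new::<u32>()).expect("a fresh allocator has space");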

impl MemBump

pub fn from_mem(mem: &mut [MaybeUninit<u8>]) -> Result<LeakBox<'_, Self>, FromMemError>

Initialize a bump allocator from existing memory.

§Usage
use core::mem::MaybeUninit;
use static_alloc::unsync::MemBump;

let mut backing = [MaybeUninit::new(0); 128];
let alloc = MemBump::from_mem(&mut backing)?;

pub unsafe fn from_mem_unchecked(mem: &mut [MaybeUninit<u8>]) -> LeakBox<'_, Self>

Construct a bump allocator from existing memory without reinitializing.

This allows the caller to (unsafely) fall back to manual borrow checking of the memory region between regions of allocator use.

§Safety

The memory must contain data that was previously wrapped as a MemBump, exactly. The only endorsed, sound way of obtaining such memory is MemBump::into_mem.

Warning: any use of the memory will have invalidated all pointers to allocated objects; more specifically, the provenance of these pointers is no longer valid. You must derive new pointers based on their offsets.

pub fn into_mem<'lt>(this: LeakBox<'lt, Self>) -> &'lt mut [MaybeUninit<u8>]

Unwrap the memory owned by an unsized bump allocator.

This releases the memory used by the allocator, similar to Box::leak, except that it operates on unique references instead. Owning the bump allocator is necessary because the memory region contains internal state that the caller could subsequently invalidate.

§Example
use core::mem::MaybeUninit;
use static_alloc::unsync::MemBump;

let memory: &mut [_] = MemBump::into_mem(alloc);
assert!(memory.len() <= 128, "Not guaranteed to use all memory");

// Safety: We have not touched the memory itself.
unsafe { MemBump::from_mem_unchecked(memory) };

pub const fn capacity(&self) -> usize

Returns the capacity of this MemBump, that is, how many bytes can be allocated within this node.

pub fn data_ptr(&self) -> NonNull<u8>

Get a raw pointer to the data.

Note that any use of the pointer must be done with extreme care as it may invalidate existing references into the allocated region. Furthermore, bytes may not be initialized. The length of the valid region is MemBump::capacity.

Prefer MemBump::get_unchecked for reconstructing a prior allocation.
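
As a small sketch, the pointer and MemBump::capacity together bound the data region; nothing is dereferenced here since the bytes may be uninitialized:
use static_alloc::unsync::MemBump;

let bump = MemBump::new(64);
// The valid region starts at `data_ptr` and spans `capacity` bytes.
let start = bump.data_ptr().as_ptr() as usize;
let end = start + bump.capacity();
// No byte is read: we only inspect the bounds.
assert!(start < end);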

pub fn alloc(&self, layout: Layout) -> Option<NonNull<u8>>

Allocate a region of memory.

This is a safe alternative to GlobalAlloc::alloc.

§Panics

This function will panic if the requested layout has a size of 0. For use through a GlobalAlloc such a request is explicitly forbidden and would allow any behaviour, but here we strictly check for it instead.

FIXME(breaking): this could well be a Result<_, Failure>.
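
A minimal sketch of allocating an aligned region, assuming the requested capacity of 256 bytes is fully usable; an oversized request is refused rather than panicking:
use core::alloc::Layout;
use static_alloc::unsync::MemBump;

let bump = MemBump::new(256);
// An aligned allocation for sixteen u64 values (128 bytes, 8-byte alignment).
let layout = Layout::array::<u64>(16).unwrap();
let ptr = bump.alloc(layout).expect("256 bytes should suffice");
assert_eq!(ptr.as_ptr() as usize % layout.align(), 0);

// A request larger than the capacity yields None instead of panicking.
assert!(bump.alloc(Layout::array::<u64>(1024).unwrap()).is_none());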

pub fn alloc_at(&self, layout: Layout, level: Level) -> Result<NonNull<u8>, Failure>

Try to allocate some layout with a precise base location.

The base location is the currently consumed byte count, without correction for the alignment of the allocation. This will succeed if the allocation can be made exactly at the expected location.

§Panics

This function may panic if the provided level is from a different slab.
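
A small sketch of the expected-location check: an allocation at the observed level succeeds, while repeating it with the now-stale level reports a failure (Result queries keep the example self-contained):
use core::alloc::Layout;
use static_alloc::unsync::MemBump;

let bump = MemBump::new(128);
let layout = Layout::new::<u8>();

let level = bump.level();
// Allocating at the observed level succeeds and advances the consumed count.
assert!(bump.alloc_at(layout, level).is_ok());
// The same level is now stale, so a second attempt is rejected.
assert!(bump.alloc_at(layout, level).is_err());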

pub fn get<V>(&self) -> Option<Allocation<'_, V>>

Get an allocation for a specific type.

It is not yet initialized but provides an interface for that initialization.

§Usage
use core::cell::{Ref, RefCell};
use static_alloc::Bump;

let slab: Bump<[Ref<'static, usize>; 1]> = Bump::uninit();
let data = RefCell::new(0xff);

// We can place a `Ref` here but we did not yet.
let alloc = slab.get::<Ref<usize>>().unwrap();
let cell_ref = unsafe {
    alloc.leak(data.borrow())
};

assert_eq!(**cell_ref, 0xff);

FIXME(breaking): this could well be a Result<_, Failure>.

pub fn get_at<V>(&self, level: Level) -> Result<Allocation<'_, V>, Failure>

Get an allocation for a specific type at a specific level.

See get for usage. This can be used to ensure that data is contiguous in concurrent access to the allocator.
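
A minimal sketch, assuming a fresh allocator; the value 0xabcd is arbitrary:
use static_alloc::unsync::MemBump;

let bump = MemBump::new(64);
let level = bump.level();
// Request a typed slot for one u16 at exactly the observed level.
if let Ok(slot) = bump.get_at::<u16>(level) {
    // Safety: the allocation was just handed out and is not aliased anywhere.
    let value = unsafe { slot.leak(0xabcd_u16) };
    assert_eq!(*value, 0xabcd);
}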

pub unsafe fn get_unchecked<V>(&self, level: Level) -> Allocation<'_, V>

Reacquire an allocation that has been performed previously.

This call won’t invalidate any other allocations.

§Safety

The caller must guarantee that no other pointers to this prior allocation are alive, or can be created. This is guaranteed if the allocation was performed previously, has since been discarded, and reset can not be called (for example, the caller holds a shared reference).

§Usage
// Create an initial allocation.
let level = alloc.level();
let allocation = alloc.get_at::<usize>(level)?;
let address = allocation.ptr.as_ptr() as usize;
// pretend to lose the owning pointer of the allocation.
let _ = { allocation };

// Restore our access.
let renewed = unsafe { alloc.get_unchecked::<usize>(level) };
assert_eq!(address, renewed.ptr.as_ptr() as usize);

Critically, you can rely on other allocations to stay valid.

let level = alloc.level();
alloc.get_at::<usize>(level)?;

let other_val = alloc.bump_box()?;
let other_val = LeakBox::write(other_val, 0usize);

let renew = unsafe { alloc.get_unchecked::<usize>(level) };
assert_eq!(*other_val, 0); // Not UB!

pub fn bump_box<'bump, T: 'bump>(&'bump self) -> Result<LeakBox<'bump, MaybeUninit<T>>, Failure>

Allocate space for one T without initializing it.

Note that the returned MaybeUninit can be unwrapped from the LeakBox. Alternatively, you can store an arbitrary value and ensure it is safely dropped before the borrow ends.

§Usage
use core::cell::RefCell;
use static_alloc::Bump;
use static_alloc::leaked::LeakBox;

let slab: Bump<[usize; 4]> = Bump::uninit();
let data = RefCell::new(0xff);

let slot = slab.bump_box().unwrap();
let cell_box = LeakBox::write(slot, data.borrow());

assert_eq!(**cell_box, 0xff);
drop(cell_box);

assert!(data.try_borrow_mut().is_ok());

FIXME(breaking): should return evidence of the level (observed, and post). Something similar to Allocation but containing a LeakBox<T> instead? Introduce that to the sync Bump allocator as well.

FIXME(breaking): align with sync Bump::get (probably rename get to bump_box).

pub fn bump_array<'bump, T: 'bump>(&'bump self, n: usize) -> Result<LeakBox<'bump, [MaybeUninit<T>]>, Failure>

Allocate space for a slice of Ts without initializing any.

Retrieve individual MaybeUninit elements and wrap them as a LeakBox to store values, use the slice as backing memory for one of the containers from without-alloc, or initialize the elements manually.

§Usage

Quicksort, implemented recursively, requires a maximum of log n stack frames in the worst case when implemented optimally. Since each frame is quite large, this is wasteful. We can instead use a properly sized buffer and implement an iterative solution (left as an exercise to the reader, or see the examples for without-alloc, where such a dynamic allocation backs an inline vector used as the stack).
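
As a minimal sketch of the allocation itself (the iterative quicksort driver is omitted), a scratch slice can be obtained and initialized manually:
use core::mem::MaybeUninit;
use static_alloc::unsync::MemBump;

let bump = MemBump::new(128);
// Uninitialized space for eight u32 values.
let mut scratch = bump.bump_array::<u32>(8)?;
assert_eq!(scratch.len(), 8);

// Initialize every element before any of them is read.
for (i, slot) in scratch.iter_mut().enumerate() {
    *slot = MaybeUninit::new(i as u32);
}
// Safety: the element was initialized by the loop above.
assert_eq!(unsafe { *scratch[0].assume_init_ref() }, 0);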

pub fn level(&self) -> Level

Get the number of already allocated bytes.

pub fn reset(&mut self)

Reset the bump allocator.

This requires a unique reference to the allocator, hence no allocation can be alive at this point. It will reset the internal count of used bytes to zero.
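
A small sketch of reclaiming capacity; the unique reference requirement guarantees no allocation is still alive when the counter is reset:
use core::alloc::Layout;
use static_alloc::unsync::MemBump;

let mut bump = MemBump::new(16);
// Consume the whole region one byte at a time.
while bump.alloc(Layout::new::<u8>()).is_some() {}
assert!(bump.alloc(Layout::new::<u8>()).is_none());

// With a unique reference, the used-byte count is wound back to zero.
bump.reset();
assert!(bump.alloc(Layout::new::<u8>()).is_some());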

Auto Trait Implementations§

Blanket Implementations§

impl<T> Any for T
where T: 'static + ?Sized,

fn type_id(&self) -> TypeId

Gets the TypeId of self.

impl<T> Borrow<T> for T
where T: ?Sized,

fn borrow(&self) -> &T

Immutably borrows from an owned value.

impl<T> BorrowMut<T> for T
where T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.