Passing parameters to a statement
In a nutshell
() -> No parameter
&a -> Single input parameter
InOut(&mut a) -> Input Output parameter
Out(&mut a) -> Output parameter
(a,b,c) -> Fixed number of parameters
&[a] -> Arbitrary number of parameters
&mut BlobParam -> Stream long input parameters
Box<dyn InputParameter> -> Arbitrary input parameter
&[Box<dyn InputParameter>] -> Arbitrary number of arbitrary input parameters
a.into_parameter() -> Convert idiomatic Rust type into something bindable by ODBC
Passing a single parameter
ODBC allows you to bind parameters to positional placeholders. In the simplest case it looks like this:
use odbc_api::Environment;
let env = Environment::new()?;
let mut conn = env.connect("YourDatabase", "SA", "My@Test@Password1")?;
let year = 1980;
if let Some(cursor) = conn.execute("SELECT year, name FROM Birthdays WHERE year > ?;", &year)? {
// Use cursor to process query results.
}
All types implementing the Parameter trait can be used.
Annotating a parameter with an explicit SQL DataType
In the last example we used a bit of domain knowledge about the query and provided it with an i32. Each Parameter type comes with a default SQL type as which it is bound. In the last example this spared us from specifying that we bind year as an SQL INTEGER (because INTEGER is the default for i32). If we want to, we can specify the SQL type independently of the Rust type we are binding, by wrapping it in WithDataType.
use odbc_api::{Environment, parameter::WithDataType, DataType};
let env = Environment::new()?;
let mut conn = env.connect("YourDatabase", "SA", "My@Test@Password1")?;
let year = WithDataType{
value: 1980,
data_type: DataType::Varchar {length: 4}
};
if let Some(cursor) = conn.execute("SELECT year, name FROM Birthdays WHERE year > ?;", &year)? {
// Use cursor to process query results.
}
In that case it is likely that the driver manager converts our annotated year into a string, which is then most likely converted back into an integer by the driver. All this converting can be confusing, but it is helpful if we do not know which types the parameters actually have (e.g. the query could have been entered by the user on the command line). There is also an option to query the parameter types beforehand, but my advice is not to trust that information blindly unless you have tested it with your driver beforehand.
Passing a fixed number of parameters
To pass a number of parameters that is fixed at compile time, you can use tuples.
use odbc_api::Environment;
let env = Environment::new()?;
let mut conn = env.connect("YourDatabase", "SA", "My@Test@Password1")?;
let too_old = 1980;
let too_young = 2000;
if let Some(cursor) = conn.execute(
"SELECT year, name FROM Birthdays WHERE ? < year < ?;",
(&too_old, &too_young),
)? {
// Use cursor to congratulate only persons in the right age group...
}
Passing an arbitrary number of parameters
We do not always know the number of required parameters at compile time. This might be the case if the query itself is generated from user input. Luckily, slices of parameters are supported, too.
use odbc_api::Environment;
let env = Environment::new()?;
let mut conn = env.connect("YourDatabase", "SA", "My@Test@Password1")?;
let params = [1980, 2000];
if let Some(cursor) = conn.execute(
"SELECT year, name FROM Birthdays WHERE ? < year < ?;",
&params[..])?
{
// Use cursor to process query results.
}
Passing input parameters parsed from the command line
In case you want to read parameters from the command line you can also let ODBC do the work of converting the text input into something more suitable.
use odbc_api::{Connection, IntoParameter, Error, parameter::VarCharSlice};
fn execute_arbitrary_command(connection: &Connection, query: &str, parameters: &[&str])
-> Result<(), Error>
{
// Convert all strings to `VarCharSlice` and bind them as `VarChar`. Let ODBC convert them
// into something better matching the types required by the query.
let params: Vec<_> = parameters
.iter()
.map(|param| param.into_parameter())
.collect();
// Execute the query as a one off, and pass the parameters. String parameters are parsed and
// converted into something more suitable by the data source itself.
connection.execute(&query, params.as_slice())?;
Ok(())
}
Should you have more type information available, but only at runtime, you can also bind an array of Box<dyn InputParameter>.
Output and Input/Output parameters
Mutable references are treated as input/output parameters. To use a parameter purely as an output parameter you may wrap it in Out. Consider a Microsoft SQL Server with the following stored procedure:
CREATE PROCEDURE TestParam
@OutParm int OUTPUT
AS
SELECT @OutParm = @OutParm + 5
RETURN 99
GO
We bind the return value as the first output parameter. The second parameter is an input/output parameter, bound as a mutable reference.
use odbc_api::{Environment, Out, InOut, Nullable};
let env = Environment::new()?;
let mut conn = env.connect("YourDatabase", "SA", "My@Test@Password1")?;
let mut ret = Nullable::<i32>::null();
let mut param = Nullable::<i32>::new(7);
conn.execute(
"{? = call TestParam(?)}",
(Out(&mut ret), InOut(&mut param)))?;
assert_eq!(Some(99), ret.into_opt());
assert_eq!(Some(7 + 5), param.into_opt());
Sending long data
Many ODBC drivers have size limits on how big parameters can be. Apart from that, you may not want to allocate really large buffers in your application in order to keep a small memory footprint. Luckily, ODBC also supports streaming data to the database batch by batch at statement execution time. To support this, this crate offers BlobParam, which can be bound as a mutable reference. An instance of BlobParam is usually created by calling Blob::as_blob_param from a wrapper implementing Blob.
Inserting long binary data from a file
BlobRead::from_path is the most convenient way to turn a file path into a Blob parameter. The following example also demonstrates that the streamed blob parameter can be combined with regular input parameters like id.
use std::{error::Error, path::Path};
use odbc_api::{Connection, parameter::{Blob, BlobRead}, IntoParameter};
fn insert_image_to_db(
conn: &Connection<'_>,
id: &str,
image_path: &Path) -> Result<(), Box<dyn Error>>
{
let mut blob = BlobRead::from_path(image_path)?;
let sql = "INSERT INTO Images (id, image_data) VALUES (?, ?)";
let parameters = (&id.into_parameter(), &mut blob.as_blob_param());
conn.execute(sql, parameters)?;
Ok(())
}
Inserting long binary data from any io::BufRead
This is more flexible than inserting just from files. Note however that files provide metadata about the length of the data, which io::BufRead does not. This is not an issue for most drivers, but some can perform optimizations if they know the size in advance. In tests, SQLite has shown a bug of inserting only empty data if no size hint has been provided.
use std::io::BufRead;
use odbc_api::{Connection, parameter::{Blob, BlobRead}, IntoParameter, Error};
fn insert_image_to_db(
conn: &Connection<'_>,
id: &str,
image_data: impl BufRead) -> Result<(), Error>
{
const MAX_IMAGE_SIZE: usize = 4 * 1024 * 1024;
let mut blob = BlobRead::with_upper_bound(image_data, MAX_IMAGE_SIZE);
let sql = "INSERT INTO Images (id, image_data) VALUES (?, ?)";
let parameters = (&id.into_parameter(), &mut blob.as_blob_param());
conn.execute(sql, parameters)?;
Ok(())
}
Inserting long strings
This example inserts title as a normal input parameter but streams the potentially much longer String in text to the database as a large text blob. This allows circumventing the size restrictions for String arguments of many drivers (usually around 4 or 8 KiB).
use odbc_api::{Connection, parameter::{Blob, BlobSlice}, IntoParameter, Error};
fn insert_book(
conn: &Connection<'_>,
title: &str,
text: &str
) -> Result<(), Error>
{
let mut blob = BlobSlice::from_text(text);
let insert = "INSERT INTO Books (title, text) VALUES (?,?)";
let parameters = (&title.into_parameter(), &mut blob.as_blob_param());
conn.execute(&insert, parameters)?;
Ok(())
}
Inserting long binary data from &[u8]
use odbc_api::{Connection, parameter::{Blob, BlobSlice}, IntoParameter, Error};
fn insert_image(
conn: &Connection<'_>,
id: &str,
image_data: &[u8]
) -> Result<(), Error>
{
let mut blob = BlobSlice::from_byte_slice(image_data);
let insert = "INSERT INTO Images (id, image_data) VALUES (?,?)";
let parameters = (&id.into_parameter(), &mut blob.as_blob_param());
conn.execute(&insert, parameters)?;
Ok(())
}
Passing the type you absolutely think should work, but does not
Sadly, not every type can be safely bound as something the ODBC C-API understands. Most prominent among those is a Rust string slice (&str).
use odbc_api::Environment;
let env = Environment::new()?;
let mut conn = env.connect("YourDatabase", "SA", "My@Test@Password1")?;
// conn.execute("SELECT year FROM Birthdays WHERE name=?;", "Bernd")?; // <- compiler error.
Alas, not all is lost. We can still make use of the crate::IntoParameter trait to convert it into something that works.
use odbc_api::{Environment, IntoParameter};
let env = Environment::new()?;
let mut conn = env.connect("YourDatabase", "SA", "My@Test@Password1")?;
if let Some(cursor) = conn.execute(
"SELECT year FROM Birthdays WHERE name=?;",
&"Bernd".into_parameter())?
{
// Use cursor to process query results.
};
Conversion for &str is not too expensive either, just one more integer on the stack. Wait, the type you wanted to use, but that I have conveniently not chosen in this example, still does not work? Well, in that case please open an issue or a pull request. crate::IntoParameter can usually be implemented entirely in safe code, and is a suitable spot to enable support for your custom types.
Structs
BlobParam: Parameter type which can be used to bind a self::Blob as a parameter to a statement, in order for its contents to be streamed to the database at statement execution time.
BlobRead: Wraps an std::io::BufRead and implements self::Blob. Use this to stream contents from an std::io::BufRead to the database. The blob implementation is going to directly utilize the buffer of the std::io::BufRead implementation, so the batch size is likely equal to its capacity.
BlobSlice: Wraps borrowed bytes with a batch_size and implements self::Blob. Use this type to send a long array of bytes to the database.
InOut: Wraps a mutable reference. Use this wrapper in order to indicate that a mutable reference should be bound as an input/output parameter.
Out: Wraps a mutable reference. Use this wrapper in order to indicate that a mutable reference should be bound as an output parameter only.
Binds a byte array as variadic sized binary data. It cannot be used for columnar bulk fetches, but if the buffer type is stack allocated it can be utilized in row-wise bulk fetches.
Binds a byte array as variadic sized character data. It cannot be used for columnar bulk fetches, but if the buffer type is stack allocated it can be utilized in row-wise bulk fetches.
WithDataType: Annotates an instance of an inner type with an SQL data type in order to indicate how it should be bound as a parameter to an SQL statement.
Traits
A Blob can stream its contents to the database batch by batch and may therefore be used to transfer large amounts of data, exceeding the driver's capabilities for normal input parameters.
Use implementations of this type as arguments to SQL statements.
Implementers of this trait can be used as individual parameters in a crate::ParameterRefCollection. They can be bound as either input parameters, output parameters, or both.
Type Definitions
A stack allocated VARBINARY type.
Parameter type for owned, variable sized binary data.
Binds a byte array as a variadic binary input parameter.
Wraps a slice so it can be used as an output parameter for binary data.
A stack allocated VARCHAR type.
Parameter type for owned, variable sized character data.
Binds a byte array as a VarChar input parameter.
Wraps a slice so it can be used as an output parameter for character data.