Struct datafusion_common::Column
pub struct Column {
    pub relation: Option<OwnedTableReference>,
    pub name: String,
}
A named reference to a qualified field in a schema.
Fields
relation: Option<OwnedTableReference>
Relation/table reference.
name: String
Field/column name.
Implementations
impl Column
pub fn new(
    relation: Option<impl Into<OwnedTableReference>>,
    name: impl Into<String>
) -> Self
Create a Column from an optional qualifier and a name. The optional qualifier, if present, will be parsed and normalized by default.
See TableReference::parse_str for full details.
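For example, a qualified column might be constructed as in the following sketch (it assumes the &str qualifier converts into an OwnedTableReference via the crate's From impls):

use datafusion_common::Column;

// Column "id" qualified by relation "t1"; the qualifier string is
// parsed and normalized as described for TableReference::parse_str.
let col = Column::new(Some("t1"), "id");
assert_eq!(col.name, "id");
assert!(col.relation.is_some());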
pub fn new_unqualified(name: impl Into<String>) -> Self
Convenience method for when there is no qualifier.
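A minimal sketch of the unqualified case:

use datafusion_common::Column;

// No table qualifier at all.
let col = Column::new_unqualified("id");
assert_eq!(col.name, "id");
assert!(col.relation.is_none());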
pub fn from_qualified_name(flat_name: impl Into<String>) -> Self
Deserialize a fully qualified name string into a Column.
Treats the name as a SQL identifier. For example, foo.BAR would be parsed as a reference to relation foo, column name bar (lower case), whereas "foo.BAR" would be parsed as a reference to a column named foo.BAR.
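A sketch of the two parsing behaviours described above (the assertions follow the documented normalization rules):

use datafusion_common::Column;

// Unquoted identifiers are normalized to lower case:
// relation "foo", column name "bar".
let col = Column::from_qualified_name("foo.BAR");
assert_eq!(col.name, "bar");
assert!(col.relation.is_some());

// A double-quoted identifier is treated as a single, case-preserved
// column name with no relation.
let quoted = Column::from_qualified_name(r#""foo.BAR""#);
assert_eq!(quoted.name, "foo.BAR");
assert!(quoted.relation.is_none());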
pub fn from_qualified_name_ignore_case(flat_name: impl Into<String>) -> Self
Deserialize a fully qualified name string into a Column, preserving the case of the column text.
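A sketch contrasting this with from_qualified_name; the assertion assumes case is preserved as documented:

use datafusion_common::Column;

// The column text keeps its original case instead of being lower-cased:
// relation "foo", column name "BAR".
let col = Column::from_qualified_name_ignore_case("foo.BAR");
assert_eq!(col.name, "BAR");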
pub fn quoted_flat_name(&self) -> String
Serialize the column into a quoted flat name string.
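A sketch; the exact output depends on which identifiers need quoting, so it is only printed here rather than asserted:

use datafusion_common::Column;

let col = Column::new(Some("foo"), "BAR");
// A flat, SQL-quoted rendering of the qualified name, with quoting
// applied where the identifier requires it.
println!("{}", col.quoted_flat_name());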
pub fn normalize_with_schemas(
    self,
    schemas: &[&Arc<DFSchema>],
    using_columns: &[HashSet<Column>]
) -> Result<Self>
👎 Deprecated since 20.0.0: use normalize_with_schemas_and_ambiguity_check instead
Qualify the column if it is not qualified yet.
If this column already has a relation, it is returned as is and the given parameters are ignored. Otherwise, this will search through the given schemas to find the column, using the first schema that matches.
A schema matches if there is a single column that, when unqualified, matches this column. There is an exception for USING statements; see below.
Using columns
Take the following SQL statement:
SELECT id FROM t1 JOIN t2 USING(id)
In this case, both t1.id and t2.id will match the unqualified column id. To express this possibility, use using_columns. Each entry in this array is a set of columns that are bound together via a USING clause, so in this example it would be [{t1.id, t2.id}]; see the sketch below.
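The using_columns argument for that statement could be built as in the following sketch, using the constructors documented above:

use std::collections::HashSet;
use datafusion_common::Column;

// One entry per USING clause; here USING(id) binds t1.id and t2.id.
let bound: HashSet<Column> =
    [Column::new(Some("t1"), "id"), Column::new(Some("t2"), "id")]
        .into_iter()
        .collect();
let using_columns = vec![bound];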
pub fn normalize_with_schemas_and_ambiguity_check(
    self,
    schemas: &[&[&DFSchema]],
    using_columns: &[HashSet<Column>]
) -> Result<Self>
Qualify the column if it is not qualified yet.
If this column already has a relation, it is returned as is and the given parameters are ignored. Otherwise, this will search through the given schemas to find the column.
Ambiguity is checked at each level of schemas.
A schema matches if there is a single column that, when unqualified, matches this column. There is an exception for USING statements; see below.
Using columns
Take the following SQL statement:
SELECT id FROM t1 JOIN t2 USING(id)
In this case, both t1.id and t2.id will match the unqualified column id. To express this possibility, use using_columns. Each entry in this array is a set of columns that are bound together via a USING clause, so in this example it would be [{t1.id, t2.id}].
Regarding the ambiguity check, schemas is structured so that multiple levels of schemas can be passed in. For example:
schemas = &[
    &[schema1, schema2], // first level
    &[schema3, schema4], // second level
]
The search proceeds level by level: all schemas in the first level are searched for a field matching the conditions above; if none is found, the next level is checked. If more than one matching column that is not a USING column is found across the schemas of a single level, an error is returned because the column is ambiguous. If all levels have been checked and no field was found, a field-not-found error is returned.
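A minimal sketch of qualifying an unqualified column against a single level containing one schema. It assumes the arrow re-export and DFSchema::try_from_qualified_schema are available in this version; errors are propagated with ?:

use datafusion_common::arrow::datatypes::{DataType, Field, Schema};
use datafusion_common::{Column, DFSchema, Result};

fn qualify_id() -> Result<()> {
    // Schema for table t1 with a single column "id".
    let t1 = DFSchema::try_from_qualified_schema(
        "t1",
        &Schema::new(vec![Field::new("id", DataType::Int32, false)]),
    )?;

    // One level with one schema, and no USING columns.
    let level: &[&DFSchema] = &[&t1];
    let col = Column::new_unqualified("id")
        .normalize_with_schemas_and_ambiguity_check(&[level], &[])?;

    assert_eq!(col.name, "id");
    assert!(col.relation.is_some()); // now qualified as t1.id
    Ok(())
}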
Trait Implementations
impl Ord for Column
impl PartialEq<Column> for Column
impl PartialOrd<Column> for Column
fn le(&self, other: &Rhs) -> bool
This method tests less than or equal to (for self and other) and is used by the <= operator.
impl Eq for Column
impl StructuralEq for Column
impl StructuralPartialEq for Column
Auto Trait Implementations
impl RefUnwindSafe for Column
impl Send for Column
impl Sync for Column
impl Unpin for Column
impl UnwindSafe for Column
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
impl<Q, K> Equivalent<K> for Q
where
    Q: Eq + ?Sized,
    K: Borrow<Q> + ?Sized,
fn equivalent(&self, key: &K) -> bool
Compare self to key and return true if they are equal.
impl<Q, K> Equivalent<K> for Q
where
    Q: Eq + ?Sized,
    K: Borrow<Q> + ?Sized,
fn equivalent(&self, key: &K) -> bool
Compare self to key and return true if they are equal.