SingleStoreDB

https://www.singlestore.com

Install

Install Ibis and dependencies for the SingleStoreDB backend.

With pip, install the singlestoredb extra:

pip install 'ibis-framework[singlestoredb]'

With conda:

conda install -c conda-forge ibis-singlestoredb

With mamba:

mamba install -c conda-forge ibis-singlestoredb

Then connect:

import ibis

con = ibis.singlestoredb.connect()

Adjust connection parameters as needed.

Connect

ibis.singlestoredb.connect

con = ibis.singlestoredb.connect(
    user="username",
    password="password",
    host="hostname",
    port=3306,
    database="database",
)
Note

ibis.singlestoredb.connect is a thin wrapper around ibis.backends.singlestoredb.Backend.do_connect.

Connection Parameters

do_connect

do_connect(self, host=None, user=None, password=None, port=None, database=None, driver=None, autocommit=True, local_infile=True, **kwargs)

Create an Ibis client connected to a SingleStoreDB database.

Parameters
Name Type Description Default
host str Hostname or URL None
user str Username None
password str Password None
port int Port number None
database str Database to connect to None
driver str Driver name: mysql, https, http None
autocommit bool Whether to autocommit transactions True
local_infile bool Enable LOAD DATA LOCAL INFILE support True
kwargs dict Additional keyword arguments passed to the underlying client {}
Examples
>>> import os
>>> import ibis
>>> host = os.environ.get("IBIS_TEST_SINGLESTOREDB_HOST", "localhost")
>>> user = os.environ.get("IBIS_TEST_SINGLESTOREDB_USER", "root")
>>> password = os.environ.get("IBIS_TEST_SINGLESTOREDB_PASSWORD", "ibis_testing")
>>> database = os.environ.get("IBIS_TEST_SINGLESTOREDB_DATABASE", "ibis_testing")
>>> port = int(os.environ.get("IBIS_TEST_SINGLESTOREDB_PORT", "3307"))
>>> con = ibis.singlestoredb.connect(
...     database=database, host=host, user=user, password=password, port=port
... )
>>> con.list_tables()
[...]
>>> t = con.table("functional_alltypes")
>>> t
DatabaseTable: functional_alltypes
  id              int32
  bool_col        boolean
  tinyint_col     int8
  smallint_col    int16
  int_col         int32
  bigint_col      int64
  float_col       float32
  double_col      float64
  date_string_col string
  string_col      string
  timestamp_col   timestamp
  year            int32
  month           int32

ibis.connect URL format

In addition to ibis.singlestoredb.connect, you can also connect to SingleStoreDB by passing a properly formatted SingleStoreDB connection URL to ibis.connect:

con = ibis.connect(f"singlestoredb://{user}:{password}@{host}:{port}/{database}")
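The URL is an ordinary f-string. The sketch below builds one with placeholder credentials, percent-encoding the user and password as a precaution, since characters like '@' or '/' in a password would otherwise break URL parsing (whether ibis.connect decodes percent-escapes is an assumption worth verifying for your version):

```python
from urllib.parse import quote_plus

# Placeholder credentials, not real values.
user = "username"
password = "p@ss/word"  # contains URL-special characters
host = "hostname"
port = 3306
database = "database"

# Percent-encode the credentials so '@' and '/' survive URL parsing.
url = f"singlestoredb://{quote_plus(user)}:{quote_plus(password)}@{host}:{port}/{database}"
print(url)
```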

singlestoredb.Backend

begin

begin(self)

Begin a transaction context for executing SQL commands.

This method provides a cursor context manager that automatically handles transaction lifecycle including rollback on exceptions and proper cleanup.

Returns

Name Type Description
Cursor SingleStoreDB cursor for executing SQL commands

Examples

>>> with con.begin() as cur:
...     cur.execute("SELECT COUNT(*) FROM users")
...     result = cur.fetchone()

compile

compile(self, expr, /, *, limit=None, params=None, pretty=False)

Compile an expression to a SQL string.

Parameters

Name Type Description Default
expr ir.Expr An ibis expression to compile. required
limit int | None An integer to effect a specific row limit. A value of None means no limit. None
params Mapping[ir.Expr, Any] | None Mapping of scalar parameter expressions to value. None
pretty bool Pretty print the SQL query during compilation. False

Returns

Name Type Description
str Compiled expression

connect

connect(self, *args, **kwargs)

Connect to the database.

Parameters

Name Type Description Default
*args Mandatory connection parameters, see the docstring of do_connect for details. ()
**kwargs Extra connection parameters, see the docstring of do_connect for details. {}

Notes

This creates a new backend instance with saved args and kwargs, then calls reconnect and finally returns the newly created and connected backend instance.

Returns

Name Type Description
BaseBackend An instance of the backend

create_database

create_database(self, name, force=False)

Create a database in SingleStoreDB.

Parameters

Name Type Description Default
name str Name of the database to create required
force bool If True, use CREATE DATABASE IF NOT EXISTS False

Examples

>>> con.create_database("my_database")
>>> con.create_database("my_database", force=True)

create_table

create_table(self, name, /, obj=None, *, schema=None, database=None, temp=False, overwrite=False)

Create a table in SingleStoreDB.

Parameters

Name Type Description Default
name str Name of the table to create required
obj ir.Table | pd.DataFrame | pa.Table | pl.DataFrame | pl.LazyFrame | None Data to insert into the table. Can be an Ibis table expression, pandas DataFrame, PyArrow table, or Polars DataFrame/LazyFrame None
schema sch.SchemaLike | None Schema for the table. If None, inferred from obj None
database str | None Database to create the table in. If None, uses current database None
temp bool Create a temporary table False
overwrite bool Replace the table if it already exists False

Returns

Name Type Description
Table The created table expression

Examples

>>> import pandas as pd
>>> df = pd.DataFrame({"x": [1, 2, 3], "y": ["a", "b", "c"]})
>>> table = con.create_table("my_table", df)

Create with explicit schema:

>>> import ibis
>>> schema = ibis.schema({"id": "int64", "name": "string"})
>>> table = con.create_table("users", schema=schema)

Create temporary table:

>>> temp_table = con.create_table("temp_data", df, temp=True)

create_view

create_view(self, name, /, obj, *, database=None, overwrite=False)

Create a view from an Ibis expression.

Parameters

Name Type Description Default
name str The name of the view to create. required
obj ir.Table The Ibis expression to create the view from. required
database str | None The database that the view should be created in. None
overwrite bool If True, replace an existing view with the same name. False

Returns

Name Type Description
ir.Table A table expression representing the view.

disconnect

disconnect(self)

Disconnect from the backend.

drop_database

drop_database(self, name, *, catalog=None, force=False)

Drop a database from SingleStoreDB.

Parameters

Name Type Description Default
name str Name of the database to drop required
catalog str | None Name of the catalog (not used in SingleStore, for compatibility) None
force bool If True, use DROP DATABASE IF EXISTS to avoid errors if database doesn’t exist False

Examples

>>> con.drop_database("my_database")
>>> con.drop_database("my_database", force=True)

drop_table

drop_table(self, name, /, *, database=None, force=False)

Drop a table from the database.

Parameters

Name Type Description Default
name str Table name to drop required
database str | None Database name None
force bool Use IF EXISTS clause when dropping False

drop_view

drop_view(self, name, /, *, database=None, force=False)

Drop a view from the backend.

Parameters

Name Type Description Default
name str The name of the view to drop. required
database str | None The database that the view is located in. None
force bool If True, do not raise an error if the view does not exist. False

execute

execute(self, expr, /, *, params=None, limit=None, **kwargs)

Execute an Ibis expression and return a pandas DataFrame, Series, or scalar.

Parameters

Name Type Description Default
expr ir.Expr Ibis expression to execute. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
limit int | str | None An integer to effect a specific row limit. A value of None means no limit. The default is in ibis/config.py. None
kwargs Any Keyword arguments {}

from_connection

from_connection(cls, con, /)

Create an Ibis client from an existing connection to a SingleStoreDB database.

Parameters

Name Type Description Default
con Connection An existing connection to a SingleStoreDB database. required

get_schema

get_schema(self, name, *, catalog=None, database=None)

Get schema for a table in SingleStoreDB.

Note

Ibis does not use the word "schema" to refer to database hierarchy.

A collection of tables is referred to as a database. A collection of databases is referred to as a catalog.

These terms are mapped onto the corresponding features in each backend (where available), regardless of the terminology the backend uses.

See the Table Hierarchy Concepts Guide for more info.

Parameters

Name Type Description Default
name str Table name to get schema for required
catalog str | None Catalog name (usually not used in SingleStoreDB) None
database str | None Database name. If None, uses current database None

Returns

Name Type Description
Schema Ibis schema object with column names and types

Examples

>>> schema = con.get_schema("users")
>>> print(schema)
Schema:
  id: int64
  name: string
  email: string
  created_at: timestamp

has_operation

has_operation(cls, operation, /)

Return whether the backend supports the given operation.

Parameters

Name Type Description Default
operation type[ops.Value] Operation type, a Python class object. required

insert

insert(self, name, /, obj, *, database=None, overwrite=False)

Insert data into a table.

Note

Ibis does not use the word "schema" to refer to database hierarchy.

A collection of tables is referred to as a database. A collection of databases is referred to as a catalog.

These terms are mapped onto the corresponding features in each backend (where available), regardless of whether the backend itself uses the same terminology.

Parameters

Name Type Description Default
name str The name of the table to which data will be inserted required
obj pd.DataFrame | ir.Table | list | dict The source data or expression to insert required
database str | None Name of the attached database that the table is located in. For backends that support multi-level table hierarchies, you can pass in a dotted string path like "catalog.database" or a tuple of strings like ("catalog", "database"). None
overwrite bool If True then replace existing contents of table False

list_databases

list_databases(self, *, like=None)

Return the list of databases.

Parameters

Name Type Description Default
like str | None A pattern in Python’s regex format to filter returned database names. None

Returns

Name Type Description
list[str] The database names that match the pattern like.
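The like filter is documented as a Python regex, not a SQL LIKE pattern. The sketch below mimics the filtering locally with re.search on hypothetical database names (whether Ibis anchors the match is an implementation detail; treat this as illustrative):

```python
import re

# Hypothetical database names; `like` is a Python-regex pattern.
databases = ["ibis_testing", "analytics", "analytics_staging", "tmp"]

pattern = "analytics"
matched = [name for name in databases if re.search(pattern, name)]
print(matched)  # ['analytics', 'analytics_staging']
```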

list_tables

list_tables(self, like=None, database=None)

List tables in a SingleStoreDB database.

Parameters

Name Type Description Default
like str | None SQL LIKE pattern to filter table names. Use ‘%’ as wildcard, e.g., ‘user_%’ for tables starting with ‘user_’ None
database tuple[str, str] | str | None Database to list tables from. If None, uses current database. Tuples are used to specify (catalog, database), but catalogs are not supported in SingleStoreDB, so this is for compatibility only. None

Returns

Name Type Description
list[str] List of table names in the specified database

Examples

>>> con.list_tables()
['users', 'orders', 'products']
>>> con.list_tables(like="user_%")
['users', 'user_profiles']
>>> con.list_tables(database="other_db")
['table1', 'table2']
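Unlike list_databases, this method takes a SQL LIKE pattern: '%' matches any run of characters and '_' matches exactly one. The helper below reproduces those semantics in pure Python so you can check a pattern locally (illustrative only; the real matching happens server-side):

```python
import re

def like_match(pattern: str, name: str) -> bool:
    """Mimic SQL LIKE semantics: '%' matches any run of characters,
    '_' matches exactly one character."""
    regex = "^" + re.escape(pattern).replace("%", ".*").replace("_", ".") + "$"
    return re.match(regex, name) is not None

for name in ["users", "user_profiles", "user", "orders"]:
    print(name, like_match("user_%", name))
# users True, user_profiles True, user False, orders False
```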

raw_sql

raw_sql(self, query, **kwargs)

Execute a raw SQL query and return the cursor.

Parameters

Name Type Description Default
query str | sg.Expression SQL query string or SQLGlot expression to execute required
kwargs Any Additional parameters to pass to the query execution {}

Returns

Name Type Description
Cursor Database cursor with query results

Examples

>>> cursor = con.raw_sql("SELECT * FROM users WHERE id = %s", (123,))
>>> results = cursor.fetchall()
>>> cursor.close()

Using with context manager:

>>> with con.raw_sql("SHOW TABLES") as cursor:
...     tables = [row[0] for row in cursor.fetchall()]

read_csv

read_csv(self, path, /, *, table_name=None, **kwargs)

Register a CSV file as a table in SingleStoreDB.

Parameters

Name Type Description Default
path str | Path Path to CSV file(s). Supports glob patterns. required
table_name str | None Optional name for the table. Generated if not provided. None
**kwargs Any Reserved for future use. {}

Returns

Name Type Description
ir.Table The registered table
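Since path supports glob patterns, a pattern like "sales_*.csv" can register several files at once. The stdlib sketch below shows which files such a glob would pick up, using a temp directory with illustrative file names (assumption: standard glob semantics; consult the backend for exact behavior):

```python
import glob
import os
import tempfile

# Create a few illustrative CSV files (plus one non-CSV) in a temp directory.
tmpdir = tempfile.mkdtemp()
for name in ["sales_2023.csv", "sales_2024.csv", "notes.txt"]:
    with open(os.path.join(tmpdir, name), "w") as f:
        f.write("x,y\n1,a\n")

# The glob that read_csv would expand: only the CSV files match.
matches = sorted(glob.glob(os.path.join(tmpdir, "sales_*.csv")))
print([os.path.basename(p) for p in matches])  # ['sales_2023.csv', 'sales_2024.csv']
```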

read_delta

read_delta(self, path, /, *, table_name=None, **kwargs)

Register a Delta Lake table in the current database.

Parameters

Name Type Description Default
path str | Path The data source. Must be a directory containing a Delta Lake table. required
table_name str | None An optional name to use for the created table. This defaults to a sequentially generated name. None
**kwargs Any Additional keyword arguments passed to the underlying backend or library. {}

Returns

Name Type Description
ir.Table The just-registered table.

read_json

read_json(self, path, /, *, table_name=None, **kwargs)

Register a JSON file as a table in the current backend.

Parameters

Name Type Description Default
path str | Path The data source. A string or Path to the JSON file. required
table_name str | None An optional name to use for the created table. This defaults to a sequentially generated name. None
**kwargs Any Additional keyword arguments passed to the backend loading function. {}

Returns

Name Type Description
ir.Table The just-registered table

read_parquet

read_parquet(self, path, /, *, table_name=None, **kwargs)

Register a parquet file as a table in the current backend.

Parameters

Name Type Description Default
path str | Path The data source. required
table_name str | None An optional name to use for the created table. This defaults to a sequentially generated name. None
**kwargs Any Additional keyword arguments passed to the backend loading function. {}

Returns

Name Type Description
ir.Table The just-registered table

reconnect

reconnect(self)

Reconnect to the database already configured with connect.

register_options

register_options(cls)

Register custom backend options.

rename_table

rename_table(self, old_name, new_name)

Rename a table in SingleStoreDB.

Parameters

Name Type Description Default
old_name str Current name of the table required
new_name str New name for the table required

Examples

>>> con.rename_table("old_table", "new_table")

sql

sql(self, query, /, *, schema=None, dialect=None)

Create an Ibis table expression from a SQL query.

Parameters

Name Type Description Default
query str A SQL query string required
schema SchemaLike | None The schema of the query. If not provided, Ibis will try to infer the schema of the query. None
dialect str | None The SQL dialect of the query. If not provided, the backend’s dialect is assumed. This argument can be useful when the query is written in a different dialect from the backend. None

Returns

Name Type Description
ir.Table The table expression representing the query

table

table(self, name, /, *, database=None)

Return a table expression for the given table name.

to_csv

to_csv(self, expr, /, path, *, params=None, **kwargs)

Write the results of executing the given expression to a CSV file.

This method is eager and will execute the associated expression immediately.

Parameters

Name Type Description Default
expr ir.Table The ibis expression to execute and persist to CSV. required
path str | Path The data source. A string or Path to the CSV file. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
kwargs Any Additional keyword arguments passed to pyarrow.csv.CSVWriter {}

to_delta

to_delta(self, expr, /, path, *, params=None, **kwargs)

Write the results of executing the given expression to a Delta Lake table.

This method is eager and will execute the associated expression immediately.

Parameters

Name Type Description Default
expr ir.Table The ibis expression to execute and persist to Delta Lake table. required
path str | Path The data source. A string or Path to the Delta Lake table. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
kwargs Any Additional keyword arguments passed to deltalake.writer.write_deltalake method {}

to_json

to_json(self, expr, /, path, **kwargs)

Write the results of expr to a json file of [{column -> value}, …] objects.

This method is eager and will execute the associated expression immediately.

Parameters

Name Type Description Default
expr ir.Table The ibis expression to execute and persist to JSON. required
path str | Path The data source. A string or Path to the JSON file. required
kwargs Any Additional, backend-specific keyword arguments. {}
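The output format described above is a JSON array of row objects (the "records" orientation). A quick stdlib sketch of the shape, with illustrative rows:

```python
import json

# to_json writes a list of {column -> value} objects, one per row.
rows = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": "b"},
]
text = json.dumps(rows)
print(text)  # [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
```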

to_pandas

to_pandas(self, expr, /, *, params=None, limit=None, **kwargs)

Execute an Ibis expression and return a pandas DataFrame, Series, or scalar.

Note

This method is a wrapper around execute.

Parameters

Name Type Description Default
expr ir.Expr Ibis expression to execute. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
limit int | str | None An integer to effect a specific row limit. A value of None means no limit. The default is in ibis/config.py. None
kwargs Any Keyword arguments {}

to_pandas_batches

to_pandas_batches(self, expr, /, *, params=None, limit=None, chunk_size=1000000, **kwargs)

Execute an Ibis expression and return an iterator of pandas DataFrames.

Parameters

Name Type Description Default
expr ir.Expr Ibis expression to execute. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
limit int | str | None An integer to effect a specific row limit. A value of None means no limit. The default is in ibis/config.py. None
chunk_size int Maximum number of rows in each returned DataFrame batch. This may have no effect depending on the backend. 1000000
kwargs Any Keyword arguments {}

Returns

Name Type Description
Iterator[pd.DataFrame] An iterator of pandas DataFrames.
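When the backend honors chunk_size, a result with N rows arrives in ceil(N / chunk_size) DataFrame batches, with only the last batch possibly smaller. Quick arithmetic sketch:

```python
import math

# Illustrative sizes; assumes the backend honors chunk_size.
total_rows = 2_500_000
chunk_size = 1_000_000

n_batches = math.ceil(total_rows / chunk_size)
batch_sizes = [min(chunk_size, total_rows - i * chunk_size) for i in range(n_batches)]
print(n_batches, batch_sizes)  # 3 [1000000, 1000000, 500000]
```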

to_parquet

to_parquet(self, expr, /, path, *, params=None, **kwargs)

Write the results of executing the given expression to a parquet file.

This method is eager and will execute the associated expression immediately.

Parameters

Name Type Description Default
expr ir.Table The ibis expression to execute and persist to parquet. required
path str | Path The data source. A string or Path to the parquet file. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
**kwargs Any Additional keyword arguments passed to pyarrow.parquet.ParquetWriter {}

to_parquet_dir

to_parquet_dir(self, expr, /, directory, *, params=None, **kwargs)

Write the results of executing the given expression to a parquet file in a directory.

This method is eager and will execute the associated expression immediately.

Parameters

Name Type Description Default
expr ir.Table The ibis expression to execute and persist to parquet. required
directory str | Path The data source. A string or Path to the directory where the parquet file will be written. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
**kwargs Any Additional keyword arguments passed to pyarrow.dataset.write_dataset {}

to_polars

to_polars(self, expr, /, *, params=None, limit=None, **kwargs)

Execute an expression and return the results as a polars DataFrame.

This method is eager and will execute the associated expression immediately.

Parameters

Name Type Description Default
expr ir.Expr Ibis expression to export to polars. required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
limit int | str | None An integer to effect a specific row limit. A value of None means no limit. The default is in ibis/config.py. None
kwargs Any Keyword arguments {}

Returns

Name Type Description
dataframe A polars DataFrame holding the results of the executed expression.

to_pyarrow

to_pyarrow(self, expr, /, *, params=None, limit=None, **kwargs)

Execute expression to a pyarrow object.

This method is eager and will execute the associated expression immediately.

Parameters

Name Type Description Default
expr ir.Expr Ibis expression to export to pyarrow required
params Mapping[ir.Scalar, Any] | None Mapping of scalar parameter expressions to value. None
limit int | str | None An integer to effect a specific row limit. A value of None means no limit. The default is in ibis/config.py. None
kwargs Any Keyword arguments {}

Returns

Name Type Description
result If the passed expression is a Table, a pyarrow table is returned. If the passed expression is a Column, a pyarrow array is returned. If the passed expression is a Scalar, a pyarrow scalar is returned.

to_pyarrow_batches

to_pyarrow_batches(self, expr, /, *, params=None, limit=None, chunk_size=1000000, **_)

Convert expression to PyArrow record batches.

This method ensures proper data type conversion, particularly for boolean values that come from TINYINT(1) columns and JSON columns.

to_torch

to_torch(self, expr, /, *, params=None, limit=None, **kwargs)

Execute an expression and return results as a dictionary of torch tensors.

Parameters

Name Type Description Default
expr ir.Expr Ibis expression to execute. required
params Mapping[ir.Scalar, Any] | None Parameters to substitute into the expression. None
limit int | str | None An integer to effect a specific row limit. A value of None means no limit. None
kwargs Any Keyword arguments passed into the backend’s to_torch implementation. {}

Returns

Name Type Description
dict[str, torch.Tensor] A dictionary of torch tensors, keyed by column name.

truncate_table

truncate_table(self, name, /, *, database=None)

Delete all rows from a table.

Note

Ibis does not use the word "schema" to refer to database hierarchy.

A collection of tables is referred to as a database. A collection of databases is referred to as a catalog.

These terms are mapped onto the corresponding features in each backend (where available), regardless of whether the backend itself uses the same terminology.

Parameters

Name Type Description Default
name str Table name required
database str | tuple[str, str] | None Name of the attached database that the table is located in. For backends that support multi-level table hierarchies, you can pass in a dotted string path like "catalog.database" or a tuple of strings like ("catalog", "database"). None
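The database argument accepts either a dotted "catalog.database" string or a (catalog, database) tuple. The helper below sketches a plausible normalization of the documented forms into one shape (this is not Ibis's internal code, just an illustration of the accepted inputs):

```python
def normalize_database(database):
    """Normalize a `database` argument to a (catalog, database) tuple.

    Accepts None, a plain name, a dotted "catalog.database" string, or a
    (catalog, database) tuple -- the forms the docstring describes.
    Illustrative only; not Ibis's internal implementation.
    """
    if database is None:
        return None
    if isinstance(database, tuple):
        return database
    if "." in database:
        catalog, _, name = database.partition(".")
        return (catalog, name)
    return (None, database)

print(normalize_database("cat.db"))  # ('cat', 'db')
```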

upsert

upsert(self, name, /, obj, on, *, database=None)

Upsert data into a table.

Note

Ibis does not use the word "schema" to refer to database hierarchy.

A collection of tables is referred to as a database. A collection of databases is referred to as a catalog.

These terms are mapped onto the corresponding features in each backend (where available), regardless of whether the backend itself uses the same terminology.

Parameters

Name Type Description Default
name str The name of the table to which data will be upserted required
obj pd.DataFrame | ir.Table | list | dict The source data or expression to upsert required
on str Column name to join on required
database str | None Name of the attached database that the table is located in. For backends that support multi-level table hierarchies, you can pass in a dotted string path like "catalog.database" or a tuple of strings like ("catalog", "database"). None
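An upsert inserts rows that have no match on the `on` column and updates rows that do. The pure-Python sketch below illustrates those semantics with dicts keyed on the join column (illustrative data; the real operation runs inside SingleStoreDB):

```python
# Existing table contents and incoming rows, joined on the "id" column.
existing = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": "b"},
]
incoming = [
    {"id": 2, "name": "B"},  # matches id=2 -> update
    {"id": 3, "name": "c"},  # no match    -> insert
]

on = "id"
merged = {row[on]: row for row in existing}
merged.update({row[on]: row for row in incoming})  # incoming rows win
result = sorted(merged.values(), key=lambda r: r[on])
print(result)
```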