Integrate Amazon SageMaker with Timescale Cloud
Amazon SageMaker AI is a fully managed machine learning (ML) service. With SageMaker AI, data
scientists and developers can quickly and confidently build, train, and deploy ML models into a production-ready
hosted environment.
This page shows you how to integrate Amazon SageMaker with a Timescale Cloud service.
To follow the steps on this page:

- Create a target Timescale Cloud service with time-series and analytics enabled. You need your connection details. This procedure also works for self-hosted TimescaleDB.
- Set up an AWS account.
Create a table in your Timescale Cloud service to store model predictions generated by SageMaker.

Connect to your Timescale Cloud service. For Timescale Cloud, open an SQL editor in Timescale Console. For self-hosted, use psql.

For better performance and easier real-time analytics, create a hypertable. Hypertables are PostgreSQL tables that automatically partition your data by time. You interact with hypertables in the same way as regular PostgreSQL tables, but with extra features that make managing your time-series data much easier.

```sql
CREATE TABLE model_predictions (
    time        TIMESTAMPTZ      NOT NULL,
    model_name  TEXT             NOT NULL,
    prediction  DOUBLE PRECISION NOT NULL
) WITH (
    tsdb.hypertable,
    tsdb.partition_column = 'time'
);
```

If you are self-hosting TimescaleDB v2.19.3 and below, create a PostgreSQL relational table, then convert it using create_hypertable. You then enable hypercore with a call to ALTER TABLE.
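A minimal sketch of that conversion, assuming the timescaledb extension is already installed in your self-hosted database; the exact ALTER TABLE options for hypercore vary by TimescaleDB version, and the options shown here are the compression-style options used on older releases:

```sql
-- Create a regular PostgreSQL relational table first.
CREATE TABLE model_predictions (
    time        TIMESTAMPTZ      NOT NULL,
    model_name  TEXT             NOT NULL,
    prediction  DOUBLE PRECISION NOT NULL
);

-- Convert it into a hypertable partitioned on the time column.
SELECT create_hypertable('model_predictions', 'time');

-- Enable hypercore. On older releases this is exposed through the
-- compression options; check the syntax for your specific version.
ALTER TABLE model_predictions SET (
    timescaledb.compress,
    timescaledb.compress_orderby = 'time'
);
```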
Create a SageMaker Notebook instance
- In Amazon SageMaker > Notebooks and Git repos, click Create Notebook instance.
- Follow the wizard to create a default Notebook instance.
Write a Notebook script that inserts data into your Timescale Cloud service
When your Notebook instance is InService, click Open JupyterLab, then click conda_python3.

Update the following script with your connection details, then paste it in the Notebook.
```python
import psycopg2
from datetime import datetime

def insert_prediction(model_name, prediction, host, port, user, password, dbname):
    conn = psycopg2.connect(
        host=host,
        port=port,
        user=user,
        password=password,
        dbname=dbname
    )
    cursor = conn.cursor()
    query = """
        INSERT INTO model_predictions (time, model_name, prediction)
        VALUES (%s, %s, %s);
    """
    values = (datetime.utcnow(), model_name, prediction)
    cursor.execute(query, values)
    conn.commit()
    cursor.close()
    conn.close()

# Example usage
insert_prediction(
    model_name="example_model",
    prediction=0.95,
    host="<host>",
    port="<port>",
    user="<user>",
    password="<password>",
    dbname="<dbname>"
)
```
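If psycopg2 is not already available in the conda_python3 kernel, install it in a notebook cell before running the script. This assumes your Notebook instance has outbound internet access:

```python
# Install the psycopg2 binary wheel into the active kernel environment.
!pip install psycopg2-binary
```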
Test your SageMaker script
Run the script in your SageMaker notebook.
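In a real pipeline you typically store predictions returned by a deployed SageMaker endpoint rather than a hard-coded value. The following sketch assumes a hypothetical endpoint named my-endpoint that accepts CSV input and returns a single numeric score; adapt the payload format and response parsing to your model:

```python
import boto3

# Invoke a deployed SageMaker endpoint (the endpoint name and payload
# format below are assumptions for illustration) and store the score.
runtime = boto3.client("sagemaker-runtime")
response = runtime.invoke_endpoint(
    EndpointName="my-endpoint",   # hypothetical endpoint name
    ContentType="text/csv",
    Body="23.5,0.7,105",          # example feature vector
)
score = float(response["Body"].read().decode("utf-8"))

insert_prediction(
    model_name="my-endpoint",
    prediction=score,
    host="<host>",
    port="<port>",
    user="<user>",
    password="<password>",
    dbname="<dbname>",
)
```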
Verify that the data is in your service
Open an SQL editor and check the model_predictions table:

```sql
SELECT * FROM model_predictions;
```

You see something like:

| time | model_name | prediction |
|------|------------|------------|
| 2025-02-06 16:56:34.370316+00 | timescale-cloud-model | 0.95 |
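Because model_predictions is a hypertable, you can run time-series queries directly on your predictions. For example, a rough sketch of an hourly rollup per model; the bucket width here is just an illustration:

```sql
-- Average prediction per model in hourly buckets.
SELECT
    time_bucket('1 hour', time) AS bucket,
    model_name,
    avg(prediction) AS avg_prediction
FROM model_predictions
GROUP BY bucket, model_name
ORDER BY bucket;
```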
You can now seamlessly integrate Amazon SageMaker with Timescale Cloud to store and analyze time-series data generated by machine learning models. You can also integrate visualization tools like Grafana or Tableau with Timescale Cloud to create real-time dashboards of your model predictions.