This content originally appeared on DEV Community and was authored by Wesley Cheek
After a lot of struggle, I finally found a simple way to do this: we can write and read Tensorflow and sklearn models/pipelines using joblib.
Local Write / Read
import joblib
from pathlib import Path

path = Path(<local path>)

# WRITE
with path.open("wb") as f:
    joblib.dump(model, f)

# READ
with path.open("rb") as f:
    model = joblib.load(f)
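The same pattern works for any picklable object. Here is a minimal, self-contained sketch of the round trip, using a temporary directory and a plain dict as a stand-in for a fitted model/pipeline:

```python
import tempfile
from pathlib import Path

import joblib

# Stand-in for a fitted model/pipeline; any picklable object works.
model = {"weights": [0.1, 0.2], "bias": 0.5}

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "model.pkl"

    # WRITE: serialize the object to disk with joblib
    with path.open("wb") as f:
        joblib.dump(model, f)

    # READ: load it back from the same file
    with path.open("rb") as f:
        restored = joblib.load(f)
```

After the round trip, `restored` is an independent copy equal to the original object.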
We can do the same thing on AWS S3 using a boto3 client:
AWS S3 Write / Read
import tempfile

import boto3
import joblib

s3_client = boto3.client('s3')
bucket_name = "my-bucket"
key = "model.pkl"

# WRITE
with tempfile.TemporaryFile() as fp:
    joblib.dump(model, fp)
    fp.seek(0)
    s3_client.put_object(Body=fp.read(), Bucket=bucket_name, Key=key)

# READ
with tempfile.TemporaryFile() as fp:
    s3_client.download_fileobj(Fileobj=fp, Bucket=bucket_name, Key=key)
    fp.seek(0)
    model = joblib.load(fp)

# DELETE
s3_client.delete_object(Bucket=bucket_name, Key=key)
Wesley Cheek | Sciencx (2022-04-19T02:09:37+00:00) Save/Load Tensorflow & sklearn pipelines from local and AWS S3. Retrieved from https://www.scien.cx/2022/04/19/save-load-tensorflow-sklearn-pipelines-from-local-and-aws-s3/