I want to access Google Cloud Storage the way I access S3 in the code below.
```python
# Amazon S3 connection
from PIL import Image
import s3fs

fs = s3fs.S3FileSystem()

with fs.open("s3://mybucket/image1.jpg") as f:
    image = Image.open(f).convert("RGB")


# Is there equivalent code like this on the GCP side?
with cloudstorage.open("gs://my_bucket/image1.jpg") as f:
    image = Image.open(f).convert("RGB")
```
Answer
You’re looking for gcsfs. Both s3fs and gcsfs are part of the fsspec project and have very similar APIs.
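If they aren’t already available, both drivers install from PyPI (assuming a pip-managed environment):

```shell
pip install s3fs gcsfs
```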
```python
from PIL import Image
import gcsfs

fs = gcsfs.GCSFileSystem()

with fs.open("gs://my_bucket/image1.jpg") as f:
    image = Image.open(f).convert("RGB")
```
Note that both of these can be accessed from the fsspec interface, as long as you have the underlying drivers installed, e.g.:
```python
from PIL import Image
import fsspec

with fsspec.open("s3://my_s3_bucket/image1.jpg") as f:
    image1 = Image.open(f).convert("RGB")

with fsspec.open("gs://my_gs_bucket/image1.jpg") as f:
    image2 = Image.open(f).convert("RGB")

# fsspec handles local paths too!
with fsspec.open("/Users/myname/Downloads/image1.jpg") as f:
    image3 = Image.open(f).convert("RGB")
```
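Beyond `fsspec.open`, fsspec can also hand back a filesystem object whose listing API (`ls`, `glob`, `exists`, …) is the same across protocols. A minimal sketch using the built-in local `"file"` protocol; the `"s3"` and `"gs"` protocols work the same way once s3fs/gcsfs are installed:

```python
import fsspec

# "file" is the built-in local filesystem; swap in "s3" or "gs"
# (with s3fs / gcsfs installed and credentials configured) for cloud buckets.
fs = fsspec.filesystem("file")

# The same exists/ls/glob API works across all protocols.
print(fs.exists("/"))
print(type(fs.ls("/")))
```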
fsspec is the file system handler underlying pandas and other libraries that parse cloud URLs. The following “just works” because fsspec provides the cloud URI handling:
```python
import pandas as pd

pd.read_csv("s3://path/to/my/aws.csv")
pd.read_csv("gs://path/to/my/google.csv")
pd.read_csv("my/local.csv")
```