Azure Blob data source support


Reference documentation:
https://doris.apache.org/zh-CN/docs/3.x/data-operate/import/data-source/azure-storage

Doris version: 3.1.2

Problem 1

Using the following configuration:

SELECT * FROM S3
(
    "uri" = "s3://your_bucket_name/s3load_example.csv",
    "format" = "csv",
    "provider" = "AZURE",
    "s3.endpoint" = "StorageAccountA.blob.core.windows.net",
    "s3.region" = "westus3",
    "s3.access_key" = "<your-ak>",
    "s3.secret_key" = "<your-sk>",
    "column_separator" = ",",
    "csv_schema" = "user_id:int;name:string;age:int"
);

The query fails with the following error:

ERROR 1105 (HY000): errCode = 2, detailMessage = Can not build s3(): No enum constant
org.apache.doris.analysis.StorageBackend.StorageType.Azure
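
As a cross-check only (not a confirmed workaround), the same file could also be loaded with a broker-free S3 Load instead of the table-valued function, to see whether the AZURE provider is rejected on the TVF code path alone. This is a minimal sketch that assumes a target table test.s3load_example already exists and that the WITH S3 clause accepts the same provider/credential properties as the TVF; the label and table names are placeholders.

-- Hypothetical cross-check: load the same Azure Blob file via S3 Load.
LOAD LABEL test.label_azure_csv_check
(
    DATA INFILE("s3://your_bucket_name/s3load_example.csv")
    INTO TABLE s3load_example
    COLUMNS TERMINATED BY ","
    (user_id, name, age)
)
WITH S3
(
    "provider" = "AZURE",
    "s3.endpoint" = "StorageAccountA.blob.core.windows.net",
    "s3.region" = "westus3",
    "s3.access_key" = "<your-ak>",
    "s3.secret_key" = "<your-sk>"
);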

Problem 2

Hive catalog: Hive is deployed on Azure Blob and the data uses abfss:// file system paths.
Create the catalog:

CREATE CATALOG `trc` PROPERTIES (
    "use_path_style" = "true",
    "type" = "hms",
    "hive.metastore.uris" = "xxxxx",
    "azure.secret_key" = "*XXX",
    "azure.region" = "westus2",
    "azure.endpoint" = "https://blobaccount.blob.core.windows.net",
    "azure.access_key" = "blobaccount"
);
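
Before querying a table, the catalog itself can be sanity-checked with the standard multi-catalog statements below (a sketch; trc is the catalog created above, and the database name test is taken from the error message that follows):

-- Confirm the catalog exists, switch to it, and browse its contents.
SHOW CATALOGS;
SWITCH trc;
SHOW DATABASES;
USE test;
SHOW TABLES;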

Querying the table returns the following error:

ERROR 1105 (HY000): errCode = 2, detailMessage = get file split failed for table: ta, err: org.apache.doris.common.UserException: errCode = 2, detailMessage = Invalid scheme: abfss://blobcontainer@blobaccount.dfs.core.windows.net/hive/warehouse/external/test.db/ta
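
For illustration, the failing read can be expressed as a plain SELECT against the external table (a hypothetical reproduction; the catalog, database, and table names trc, test, and ta come from this report and the error message above):

-- Hypothetical reproduction of the read that fails with "Invalid scheme: abfss://...".
SELECT * FROM trc.test.ta LIMIT 10;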