The export SQL is as follows:
SELECT * FROM os_datalake.ODS_MES_FTP_LAADS01_PRODUCT
WHERE `DATETIME` >= '2024-12-02 00:00:00'
AND `DATETIME` < '2024-12-03 00:00:00'
INTO OUTFILE "s3://datalake-archery/test_file/test_db/ODS_MES_FTP_LAADS01_PRODUCT/20241202/"
FORMAT AS CSV
PROPERTIES (
"s3.endpoint" = "http://10.210.94.150:9000",
"s3.access_key" = "as",
"s3.secret_key" = "sk",
"s3.region" = "db",
"max_file_size" = "2048MB",
"use_path_style" = "true",
"s3.ssl.enabled" = "false"
);
The data size is roughly 20 GB.
Every run fails with an error similar to the following:
1105, 'errCode = 2, detailMessage = (10.205.128.117)[CANCELLED]error status [IO_ERROR]failed to upload part (bucket=datalake-archery, key=s3://datalake-archery/orados_doris/os_datalake/ODS_MES_FTP_LACAP01_PRODUCT/202311/2023-11-05/3980a98d78fc4478-b04dd9066c643ee3_0.csv, part_num=26, up_load_id=YzhhOTE1ODctZTA3MC00MTZkLWExOTgtOGM0NWIwOGIwZWEwLmQ3ZDc2NWNhLWVmYmItNGRjNS1iYjFhLWZmMGEzOWU1MjMwYg): curlCode: 28, Timeout was reached, exception , error code -1\n\n\t0# doris::io::S3FileWriter::_upload_one_part(long, doris::io::UploadFileBuffer&) at /home/zcp/repo_center/doris_release/doris/be/src/common/status.h:463\n\t1# doris::io::UploadFileBuffer::on_upload() at /home/zcp/repo_center/doris_release/doris/be/src/io/fs/s3_file_bufferpool.cpp:167\n\t2# doris::ThreadPool::dispatch_thread() at /home/zcp/repo_center/doris_release/doris/be/src/util/threadpool.cpp:0\n\t3# doris::Thread::supervise_thread(void*) at /var/local/ldb-toolchain/bin/../usr/include/pthread.h:562\n\t4# start_thread\n\t5# __clone\n, complete parts 118, cur part num 395, whole parts part_numbers: 4 8 17 11 5 13 37 53 21 52 22 54 9 30 46 12 40 47 7 63 38 31 19 67 62 70 25 86 55 78 59 28 49 6 77 45 32 16 87 14 34 3 76 68 82 57 1 94 95 69 81 41 51 23 93 107 110 111 66 15 10 101 102 97 18 56 29 117 123 85 88 39 112 36 98 24 2 119 80 61 104 114 72 75 65 20 92 96 122 91 58 50 120 121 99 109 64 60 33 79 108 103 100 116 115 71 89 74 83 90 73 118 113 42 27 105 106 84'
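For context, a quick back-of-envelope check (my own assumption, not confirmed for my Doris version: the BE uploads S3 multipart parts from a ~5 MB write buffer). Under that assumption, each 2048 MB rolled file would need about 410 parts, which lines up with the "cur part num 395" in the log, so the failure looks like a per-part curl timeout (curlCode 28) partway through a long multipart upload rather than a connectivity problem:

```python
# Sanity-check the multipart numbers from the error log.
# ASSUMPTION: part_size of 5 MB is the BE's default S3 write buffer; adjust if yours differs.
GB = 1024 ** 3
MB = 1024 ** 2

total_bytes = 20 * GB        # approximate export size from the post
max_file_bytes = 2048 * MB   # "max_file_size" = "2048MB" in the OUTFILE properties
part_size = 5 * MB           # hypothetical per-part upload buffer

files = -(-total_bytes // max_file_bytes)        # ceiling division -> number of rolled CSV files
parts_per_file = -(-max_file_bytes // part_size) # ceiling division -> multipart parts per file

print(files, parts_per_file)  # → 10 410
```

If the 5 MB assumption holds, ~410 parts per file means a single slow part upload to the MinIO endpoint is enough to hit the timeout, which would explain why a 200 MB export (only a few dozen parts) succeeds.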
Doris can reach port 9000 on 10.210.94.150, and a small export (about 200 MB) uploads successfully; only the large export fails. Can anyone point me in the right direction?