hadoop-cos Versions

hadoop-cos (the CosN filesystem) provides integration for big-data compute frameworks such as Apache Hadoop, Spark, and Tez, so that data stored on Tencent Cloud COS can be read and written just like data on HDFS. It can also serve as Deep Storage for query and analytics engines such as Druid.
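As a rough illustration of the HDFS-like access described above, the sketch below reads and writes a cosn:// path through the standard Hadoop FileSystem API. The bucket name, region, and credential values are placeholders, and the fs.cosn.* keys follow the hadoop-cos documentation but should be checked against the version you actually deploy.

```java
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class CosNQuickStart {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // CosN filesystem implementation plus placeholder credentials and region.
        conf.set("fs.cosn.impl", "org.apache.hadoop.fs.CosFileSystem");
        conf.set("fs.cosn.userinfo.secretId", "<your-secret-id>");
        conf.set("fs.cosn.userinfo.secretKey", "<your-secret-key>");
        conf.set("fs.cosn.bucket.region", "ap-guangzhou");

        Path path = new Path("cosn://examplebucket-1250000000/demo/hello.txt");
        try (FileSystem fs = path.getFileSystem(conf)) {
            // Write a small object, then read it back, exactly as with an HDFS path.
            try (FSDataOutputStream out = fs.create(path, true)) {
                out.write("hello cosn".getBytes(StandardCharsets.UTF_8));
            }
            try (FSDataInputStream in = fs.open(path)) {
                IOUtils.copyBytes(in, System.out, 4096, false);
            }
        }
    }
}
```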

v8.3.3

6 months ago

Feature:

  • Optimized random read performance significantly.

v8.3.2

6 months ago

Features:

  • Improved frequency control support for the List operation.

v8.3.1

11 months ago

Fixes:

  • Reverted commit 26f29ecdb2eb6a4eb370505f529fed47b2c5918f to fix infinite invalid retries.

v8.3.0

1 year ago

Features:

  • Fixed an L5 update bug.
  • Support custom-defined credentials providers (see the configuration sketch after this list).
  • Avoid a null pointer by initializing the native COS client independently, rather than only when the Ranger client is initialized.
  • Reduced the maximum connection number from 2048 to the default of 1024.
  • Support querying a metadata-acceleration bucket in the POSIX way via cosn://bucket/.
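
The custom credentials provider is wired in purely through configuration. The sketch below is a hypothetical example of pointing CosN at a user-supplied provider class: the fs.cosn.credentials.provider key follows the hadoop-cos documentation, while com.example.MyCredentialsProvider is a placeholder class name, and the interface that class must implement should be taken from the release you run.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CustomProviderConfig {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.cosn.impl", "org.apache.hadoop.fs.CosFileSystem");
        // Ask CosN to obtain credentials from the custom provider instead of the
        // static fs.cosn.userinfo.secretId/secretKey pair. The class name below
        // is a placeholder for your own implementation.
        conf.set("fs.cosn.credentials.provider", "com.example.MyCredentialsProvider");

        try (FileSystem fs = new Path("cosn://examplebucket-1250000000/").getFileSystem(conf)) {
            System.out.println("Initialized filesystem for " + fs.getUri());
        }
    }
}
```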

v8.2.6

1 year ago

Improves:

  • Removed the logic that recursively checks directories, improving create operation performance.

Bug Fixes:

  • Fixed a bug in the seekable write operation.

v8.2.4

1 year ago

Fixes:

  • Fixed close(): added a super.close() call to avoid the case where the filesystem is closed but not returned to the cache, which caused a "filesystem already closed" exception on the next use.
  • Fixed close() so that resources are released only once.
  • Fixed the symbolic link call in listStatus().

Note: typically only the hadoop-cos and cos_api-bundle packages are needed. Add the ofs-sdk-definition package if you need the random write feature, and the tencentcloud-sdk-java-kms and tencentcloud-sdk-java-common packages if you need client-side encryption.

v8.2.3

1 year ago

Features:

  • Support independent bucket configuration.
  • Support client-side encryption.
  • Changed the class loading of the seekable write feature to avoid a NoClassDefFoundError for the Seekable class.
  • Added a list-parts check to double-check for upload part conflicts.

Note: typically only the hadoop-cos and cos_api-bundle packages are needed. Add the ofs-sdk-definition package if you need the random write feature, and the tencentcloud-sdk-java-kms and tencentcloud-sdk-java-common packages if you need client-side encryption.

v8.2.2

1 year ago

Fix:

  • Refactor: modified the internal abort interface to adapt to Hadoop 3.3.0 or later.