S3Scanner Releases

Scan for misconfigured S3 buckets across S3-compatible APIs!

v3.0.4

7 months ago

What's Changed

Full Changelog: https://github.com/sa7mon/S3Scanner/compare/v3.0.3...v3.0.4

v3.0.3

7 months ago

Changes

  • chore
  • bugfix
  • refactor
  • feature

Full Changelog: https://github.com/sa7mon/S3Scanner/compare/v3.0.2...v3.0.3

v3.0.2

8 months ago

Changes

  • bugfix
  • feature
  • refactor

New Contributors

Full Changelog: https://github.com/sa7mon/S3Scanner/compare/v3.0.1...v3.0.2

v3.0.1

9 months ago

What's Changed

Full Changelog: https://github.com/sa7mon/S3Scanner/compare/v3.0.0...v3.0.1

v3.0.0

9 months ago

2.0.2

2 years ago

Changes

  • Fixes #122 - CVE-2021-32061: Path Traversal via dump of malicious bucket

2.0.1

3 years ago

Quick update to 2.0.0 to improve endpoint validation and add support for GCP. I also goofed and broke the Pip package, so this release remedies that.

Changes

  • Improve endpoint validation
  • Add automated tests to validate 3rd party endpoints

2.0.0

3 years ago

This is almost a complete re-write of the tool including scanning logic and output and adds a good amount of new functionality. The code is now much cleaner and simpler than before.

Changes

  • ‼️ Added checks for "dangerous" permissions: Write, WriteACP
  • ✏️ Simplified the output so there are no longer separate formats for file and console output. Everything is now written to stdout in a uniform way, allowing easy parsing with grep/awk/etc
  • 🔭 Support added for non-AWS S3-compatible APIs. This was done in a generic way to avoid having to include API-specific code in the tool and update it when the APIs inevitably change or break
  • 🐍 Pip package created and distributed
  • 🐳 Built and pushed a Docker image to Docker Hub
  • 📈 Increased overall test coverage to ~90%
  • ⚡️ Added support for multi-threaded scanning and dumping
  • 💾 Added support for "resume-able" dumping. If an object has already been downloaded, it will be skipped unless the sizes differ
  • 🔎 Added Travis CI tests to verify functionality on Python 3.6-3.9

Known Issues / Future Work

  • Currently, non-AWS endpoints are only scanned for anonymous permissions. Testing is needed to see if credential scans work and if the permissions match AWS structure.
  • When dumping a bucket, the tool checks whether each file has already been downloaded. If it has, the file is skipped unless the local and remote file sizes differ. In the future, the user should be given a choice to re-download these files.
  • Measure user desire for other output formats (i.e. csv/json/sqlite)
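
The resume logic described above (skip a file unless the local and remote sizes differ) amounts to a simple check; a sketch, where `should_download` is a hypothetical helper rather than the tool's actual function:

```python
import os

def should_download(local_path: str, remote_size: int) -> bool:
    """Skip files already dumped, unless local and remote sizes differ."""
    if not os.path.exists(local_path):
        return True  # never downloaded: fetch it
    local_size = os.path.getsize(local_path)
    return local_size != remote_size  # re-fetch only on size mismatch
```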

1.0.0

6 years ago

Changes

  • Major: The checkBucket() function was changed to use boto to check for buckets instead of issuing a plain HTTP GET request. This is better for several reasons:
    • Buckets that are not open to the public, but are listable or are only open to authenticated users are now properly found
    • Regions are no longer needed
  • Due to the change to use boto, AWS credentials are nearly required. The tool will still run without them, but results will be incredibly inaccurate. Users will receive a warning if credentials are not found.
  • The buckets.txt file now contains only bucket names instead of bucket:region
  • The screen output now says whether or not the bucket was found. The concept of 'open' vs 'closed' buckets no longer exists. This may change in the future.
  • The newly added checkBucketWithoutCreds will now issue a maximum of 2 requests to check if a bucket exists. This helps ease the issue of 503's being returned intermittently.
  • When dumping a bucket, the user can see that it's being dumped without the noisy per-file output, and can still cancel with Ctrl+C. A dumping progress feature will most likely be implemented in the future.
  • If a bucket doesn't allow listing of its contents, the size will be "AccessDenied" or "AllAccessDisabled".
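
A boto-style existence check like the one described above hinges on the response to HeadBucket: 200 means the bucket exists and is accessible, 403 means it exists but access is denied, and 404 means it doesn't exist. A rough sketch, not S3Scanner's actual checkBucket(); `client` is anything exposing head_bucket(), e.g. boto3.client("s3"):

```python
def bucket_exists(client, name: str) -> bool:
    """Return True if the bucket exists, even when access is denied."""
    try:
        client.head_bucket(Bucket=name)
        return True  # 200: exists and accessible
    except Exception as e:
        # boto3's ClientError carries the HTTP code in e.response
        code = getattr(e, "response", {}).get("Error", {}).get("Code", "")
        if code == "403":
            return True  # exists, but we lack permission
        return False  # 404 (or other failure): treat as not found
```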

Features

  • Added: getAcl() to try to get the ACLs associated with found buckets. They're currently only output to the screen.
  • Removed: The --default-region argument. The new way of checking if buckets exist doesn't need the bucket's region and neither do any of the other functions. We're region-free now baby
  • Added: The --version argument. Pretty self-explanatory
  • Removed: The --include-closed argument. Now that the tool is more self-aware of the permissions on a bucket, it can be hard to determine what makes a bucket "open" or "closed". Disabling for now until I determine a better way to handle it.

Bugs

  • #33 - Public-ness is not accurate - Using boto now fixes this
  • #32 - Buckets in bucket-stream form return 'not found' - s3scanner.py now parses the bucket name out and ignores the region
  • #43 - Cancelling bucket dumping doesn't work - Implicitly fixed because now the dump process is not in the foreground.

Dev

  • The travis-ci config was changed to allow for 4 total jobs: run Python 2.7 and 3.6 and run each version with and without AWS credentials configured.
  • A testing requirement matrix was created in the wiki to allow for easier tracking of test coverage.
  • Added tests:
    • test_checkBucketInvalidName()
    • test_checkAwsCreds()
    • test_checkBucketName()
    • test_checkBucketWithoutCreds()
  • Removed tests:
    • test_checkIncludeClosed()
    • test_outputFormat()

0.2.0

6 years ago

Changes

This release adds some really cool functionality and improved stability. Thanks to @vysec, there's now a --list argument to enable saving bucket listings to file. Currently this takes a long time if a bucket contains many files; in the near future I'll look at adding multi-threading/processing to speed the whole process up.

Features

  • #23 - The --list argument was added.
    • A warning is now given if properly configured AWS credentials are not found. Many open buckets will show up as closed if creds aren't given.
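
The --list behavior (writing a bucket's object listing to a file) can be sketched like this; `save_listing` and its tab-separated format are illustrative assumptions, not the tool's actual implementation, and `objects` stands in for any iterable of (key, size) pairs such as one produced by a boto paginator:

```python
def save_listing(objects, out_path: str) -> int:
    """Write one size<TAB>key line per object; return the number written."""
    count = 0
    with open(out_path, "w") as f:
        for key, size in objects:
            f.write(f"{size}\t{key}\n")
            count += 1
    return count
```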

Bugs

  • #34 - GetBucketSize() doesn't get the size of buckets - @vysec added a workaround for now. Thanks!
  • #31 - Bucket names with punctuation break scanning - Added checks to verify candidate bucket names conform to the Amazon S3 bucket name conventions.
  • #29 - Remove test.txt - Removed test.txt, which served no purpose.
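
The name check added for #31 can be sketched as a regex over Amazon's published naming rules (3-63 characters; lowercase letters, digits, dots, and hyphens; must start and end with a letter or digit). This is an assumption-based sketch, not the tool's actual validator, and it is deliberately not exhaustive (e.g. it doesn't exclude IP-address-shaped names):

```python
import re

# 3-63 chars: alphanumeric first/last, with dots and hyphens allowed between
BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.\-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    """Rough check against the S3 bucket naming conventions."""
    return bool(BUCKET_RE.match(name))
```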

Dev

  • The function test_setup was added as a noobish way to do test setup. There's probably a better way to do it with pytest.
  • #35 - Test: test_listBucket()
  • Added test scenarios to existing test to avoid #31 regressions.