Simple & Efficient data access for Scala and Scala.js
Besides some library upgrades, this version allows cached requests to be logged.
```
Fetch execution 🕛 0,08 seconds

  [Round 1] 🕛 0,02 seconds
    [Batch] From `Users` with ids List(1, 2) cached false 🕛 0,02 seconds
  [Round 2] 🕛 0,00 seconds
    [Fetch one] From `Users` with id 2 cached true 🕛 0,00 seconds
  [Round 3] 🕛 0,01 seconds
    [Fetch one] From `Users` with id 4 cached false 🕛 0,01 seconds
  [Round 4] 🕛 0,01 seconds
    [Batch] From `Posts` with ids List(1, 2, 3) cached false 🕛 0,01 seconds
    [Batch] From `Users` with ids List(3) cached false 🕛 0,01 seconds
    [Batch] From `Users` with ids List(1, 2) cached true 🕛 0,00 seconds
```
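Output like the above can be produced with `Fetch.runLog` together with the `fetch.debug.describe` helper. A minimal sketch, assuming a hypothetical `fetchUser` backed by a `Users` data source (not shown in these notes):

```scala
import cats.effect.IO
import fetch._
import fetch.debug.describe

// Hypothetical fetch backed by a `Users` data source (definition omitted)
def fetchUser(id: Int): Fetch[IO, String] = ???

// runLog returns the execution log alongside the result; describe(log)
// renders it, and as of this release also reports whether each request
// was served from the cache
val program: IO[Unit] = Fetch.runLog(fetchUser(1)).flatMap {
  case (log, user) => IO(println(describe(log)))
}
```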
Thanks @a-khakimov
@47erbot, @a-khakimov and @fedefernandez
This version brings some library upgrades:
@47erbot, @calvellido, @cb372, @davesmith00047, @fedefernandez, @israelpzglez, @juanpedromoreno, @matsluni and @sloshy
Fetch 3.1.0 is mostly the same as Fetch 3.0.0, but rolls back a behavioral change from 3.0.0. It does not introduce any breaking changes and the API remains the same.
In Fetch 3.0.0, fixing a bug introduced with the release of Cats 2.7.0 a few months back led us to believe that it would be better for the library going forward not to explicitly guarantee auto-batching of fetch requests when using Cats syntax methods such as `.traverse` or `.sequence`. This bug was introduced because Fetch's auto-batching behavior depended on certain Cats implementation details that unexpectedly changed. We put together a quick fix shortly after the release of 3.0.0, initially intended for another 2.x series release, but we have decided to continue the 3.x line instead. The documentation has been updated to reflect that on versions >=3.1.0 you will still get implicit batching where possible, though it might still go away in the future.
In short, you will continue to get automatic batching on calls to `.sequence` and `.traverse` with Fetch 3.1.0 and going forward, in addition to the new syntax we added previously, which will stay for backwards compatibility.
In the meantime, please follow 47 Degrees on Twitter and stay up-to-date with our blog, as we have exciting news for Fetch users coming soon.
@47erbot and @sloshy
NOTE: This release was superseded by 3.1.0 shortly after, which rolls back the underlying behavioral changes described below but keeps the new syntax additions. Please update to 3.1.0 instead to minimize necessary changes from the 2.x series.
The original release notes are below:
Fetch has been updated to 3.0.0. This release contains one breaking behavioral change and several upstream library upgrades (Cats 2.7, Cats Effect 3.3.4), as well as a patch release of http4s (0.23.8).
PSA: If you upgrade to Cats 2.7.0 without upgrading to Fetch 3.0.0, this can affect the behavior of auto-batching. We strongly recommend that you upgrade to Fetch 3.x if you are on Cats 2.7.0, and follow the migration instructions below to ensure batches are handled properly.
The breaking change involves how batches are created. Previously, if you ran functions like `.traverse` or `.sequence` to produce a list of fetch results, Fetch would implicitly try to batch requests together. For Fetch 3.0.0 we have decided to provide explicit batching syntax instead of guaranteeing implicit batching of fetches. This may be further expanded upon in future releases.
Currently, batching with `Applicative` syntax can still occur, but batching with `traverse`, `sequence`, or anything that internally uses `map2Eval` is unsupported. This may change in future releases as more explicit batching syntax is added.
For example, assume you have a function `fetchName` which fetches a user's name given an integer ID. You want to batch multiple requests to save on bandwidth, so you write the following code:
```scala
import cats.effect.{Concurrent, IO}
import cats.syntax.all._
import fetch._

// Fetches a user's name by ID
def fetchName[F[_]: Concurrent](id: Int): Fetch[F, String] = ???

// Attempts to batch a request for users 1, 2, and 3
val batchedRequests: Fetch[IO, List[String]] = List(1, 2, 3).traverse(id => fetchName[IO](id))
```
In older versions of Fetch this would batch the requests, trying to get all three at once, but in the current version (3.0.0) this behavior has been made explicit. You now opt in to it either by calling `Fetch.batchAll` or by using the special syntax, like so:
```scala
import fetch.syntax._

// You can explicitly batch requests without syntax using Fetch.batchAll.
// It uses varargs, so you can pass in any number of fetches manually, or expand a list with `: _*` syntax
val explicitBatchedRequest = Fetch.batchAll(fetchName[IO](1), fetchName[IO](2), fetchName[IO](3))

// batchAll syntax works similarly to `sequence` and operates on any `Seq[Fetch[F, A]]`
val batchedRequestSyntax = List(1, 2, 3).map(id => fetchName[IO](id)).batchAll

// batchAllWith syntax is similar, but works like `traverse` instead
val batchedRequestWithSyntax = List(1, 2, 3).batchAllWith(id => fetchName[IO](id))
```
In your code, if you would like to retain batching in places where you already use `traverse` or `sequence`, you can simply replace them with the `batchAllWith` and `batchAll` syntax respectively and retain the old functionality. Any `traverse` or `sequence` calls in Fetch >=3.0.0 are not guaranteed to be batched, and will be sequenced like any other `flatMap` chain or for-comprehension.
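To make the distinction concrete, here is a sketch of a `flatMap` chain, written as a for-comprehension, that is never batched under 3.0.0; `fetchName` is the same hypothetical function used in the examples above:

```scala
import cats.effect.{Concurrent, IO}
import fetch._

// Same hypothetical fetch as in the examples above
def fetchName[F[_]: Concurrent](id: Int): Fetch[F, String] = ???

// A for-comprehension desugars to a flatMap chain: the second fetch only
// starts once the first has completed, so the two run in separate,
// sequential rounds rather than as one batch
val sequential: Fetch[IO, (String, String)] = for {
  a <- fetchName[IO](1)
  b <- fetchName[IO](2)
} yield (a, b)
```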
@47erbot, @fedefernandez, @juanpedromoreno and @sloshy
@47erbot, @jordiolivares and @juanpedromoreno
@47erbot, @Daenyth, @jordiolivares and @juanpedromoreno
⚠️ This is a major release, which is not binary compatible with the 1.x release series.
@47erbot, @Daenyth, @benderpremier and @juanpedromoreno
@47erbot, @BenFradet, @adpi2, @alejandrohdezma, @juanpedromoreno, @scala-steward and @sloshy
.github
release and SBT build improvements (#335) @alejandrohdezma
@47erbot, @BenFradet, @alejandrohdezma and @scala-steward
@47erbot, @AntonioMateoGomez, @BenFradet, @franciscodr, @github-actions, @github-actions[bot], @juanpedromoreno and @scala-steward