A production-grade HBase ORM library that makes accessing HBase clean, fast and fun (can also be used as a Bigtable ORM).
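As a sketch of the bean-like mapping idea, here is a minimal pure-Java model. The interface and class names below (`HBRecordSketch`, `CitizenSketch`) are illustrative stand-ins, not the library's actual classes; the real record interface lives in `com.flipkart.hbaseobjectmapper`.

```java
import java.io.Serializable;

// Hypothetical stand-in for the library's record interface, for illustration only.
interface HBRecordSketch<R extends Serializable & Comparable<R>> {
    R composeRowKey();          // build the HBase row key from the object's fields
    void parseRowKey(R rowKey); // populate the key fields back from a row key
}

// A bean-like entity whose row key is "<countryCode>#<uid>"
class CitizenSketch implements HBRecordSketch<String> {
    String countryCode;
    Integer uid;

    CitizenSketch(String countryCode, Integer uid) {
        this.countryCode = countryCode;
        this.uid = uid;
    }

    @Override
    public String composeRowKey() {
        return countryCode + "#" + uid;
    }

    @Override
    public void parseRowKey(String rowKey) {
        String[] parts = rowKey.split("#");
        this.countryCode = parts[0];
        this.uid = Integer.parseInt(parts[1]);
    }
}
```

The round trip (object → row key → object) is the core contract such a record class must uphold.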
- Added `increment()` methods on the DAO class.
- DAO constructors now take a `Connection` object as a parameter.
- `AbstractHBDAO`: New constructor that takes a custom `HBObjectMapper`.
- Renamed `get(List<Get>)` to `getOnGets` to improve clarity.
- `DeserializationException` and `SerializationException` moved to package `com.flipkart.hbaseobjectmapper.codec.exceptions`.
- Renamed `getHBFields` to `getHBColumnFields`.
- You now need to handle `null` yourself in your `composeRowKey` implementation.
- Added `AbstractHBDAO::getGet(rowKey)` and `AbstractHBDAO::get(Get)` methods to enable advanced/customized read access patterns on HBase (e.g. "get all versions in the last 5 days", "get all versions between last Sunday and yesterday").
- Added constructor `AbstractHBDAO(Configuration, Codec)`.
- Renamed `HBObjectMapper.rowKeyToIbw` to `HBObjectMapper.toIbw`. (This method wasn't doing anything specific to the row key. To compose the row key of your bean-like class, use `HBObjectMapper.getRowKey`.)
- Removed the `getColumnFamilies()` method from the `HBObjectMapper` and `AbstractDAO` classes. (Use the alternative `getColumnFamiliesAndVersions()` method instead.)
- The `AbstractHBDAO` class now implements the `Closeable` interface. You can now use your DAO classes in a try-with-resources statement (and not worry about closing the underlying connection).
- The `@HBTable` annotation now takes column families and the max number of versions. As an implication:
  - Added the `AbstractHBDAO::getColumnFamiliesAndVersions` method. Deprecated the `AbstractHBDAO::getColumnFamilies` method.
  - Column families used in `@HBColumn` annotations must now be pre-declared in `@HBTable`.
- Earlier, the row key was always serialized as a `String`, even when it was declared as some other type (e.g. `Long` or a custom class) in your bean-like entity class. This has now been fixed.
- Exceptions thrown by your `Codec` (be it the default codec or your own) are now wrapped into a `CodecException`.
- The default codec (`BestSuitCodec`) has supported a flag `serializeAsString` since v1.5. This `String` flag has now been made into a constant, `BestSuitCodec.SERIALISE_AS_STRING` (to avoid spelling errors).
- Fixed a bug affecting any `HBRecord` class containing member variables that are collections of custom classes (e.g. `List<SomeClass>`, `Map<String, SomeOtherClass>`). Added detailed test cases to prevent regression of this issue in upcoming versions.
- Introduced codec flags in the `HBColumn` and `HBColumnMultiVersion` annotations (e.g. you may now serialize the field `Integer id` in the `Citizen` class differently from the field `Integer id` in the `Employee` class).
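A minimal sketch of how per-field codec flags can drive serialization. The class and flag-handling below are illustrative only (the library's actual `Codec` API differs); it just shows the idea that the same Java type can serialize differently depending on flags attached to a field's annotation.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Illustrative sketch, not the library's actual Codec interface: a per-field
// flag map decides how the same Java type is serialized for different fields.
class FlagAwareIntCodec {
    static final String SERIALIZE_AS_STRING = "serializeAsString";

    // Serialize an Integer either as its decimal string or as 4 raw bytes,
    // depending on the flags attached to the field.
    static byte[] serialize(Integer value, Map<String, String> codecFlags) {
        if ("true".equals(codecFlags.get(SERIALIZE_AS_STRING))) {
            return value.toString().getBytes(StandardCharsets.UTF_8);
        }
        return ByteBuffer.allocate(4).putInt(value).array();
    }
}
```

With this shape, two fields of the same type in two classes can carry different flags and thus land in HBase in different byte representations.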
- Added the parameter `codecFlags` in the `@HBColumn` and `@HBColumnMultiVersion` annotations.
- Earlier, serialization of fields of the types `Boolean`, `Short`, `Integer`, `Long`, `Float`, `Double`, `String` and `BigDecimal` could not be customized. From this version on, you may customize serialization of any data type, including these. This gives you a much higher level of control.
- Removed `serializeAsString` from the `@HBColumn` and `@HBColumnMultiVersion` annotations. To retain the earlier behavior, specify `serializeAsString=true` as a `codecFlag` for the class field. The default codec (the `BestSuitCodec` class) handles this flag; if you are using your own codec, you'll have to handle it explicitly.
- Changes to the `Codec` interface: `valueToByteArray` and `rowKeyToIbw` now accept the input row key as `Serializable & Comparable`, instead of just `Serializable` (in line with the signatures of the rest of the methods). Internally, the `writeValueAsResult` and `writeValueAsPut` methods now handle objects representing the row key as `Serializable & Comparable`, instead of just `Serializable`.
- The row key can now be of any type that is `Comparable` with itself and `Serializable`. (In earlier versions of this library, the row key could only be mapped as a `String`.) The `HBRecord` interface is now generic. The `AbstractHBDAO` class now takes two type parameters (row key type and entity type), instead of one.
- `JacksonJsonCodec` can now take your own instance of `ObjectMapper`.
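The generic row-key support described above can be sketched with a toy generic interface (the names `GenericRecordSketch`, `EventSketch` and `SketchStore` are illustrative, not the library's API): once the record interface is generic in the row-key type, any type that is `Serializable` and `Comparable` with itself can serve as a row key, e.g. a `Long` timestamp.

```java
import java.io.Serializable;
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch: the record interface is generic in the row-key type.
interface GenericRecordSketch<R extends Serializable & Comparable<R>> {
    R composeRowKey();
}

class EventSketch implements GenericRecordSketch<Long> {
    long timestamp;

    EventSketch(long timestamp) { this.timestamp = timestamp; }

    @Override
    public Long composeRowKey() { return timestamp; } // Long row key, not String
}

class SketchStore {
    // A sorted map stands in for an HBase table: row keys sort by Comparable order.
    static <R extends Serializable & Comparable<R>> Map<R, GenericRecordSketch<R>> index(
            Iterable<? extends GenericRecordSketch<R>> records) {
        Map<R, GenericRecordSketch<R>> table = new TreeMap<>();
        for (GenericRecordSketch<R> r : records) {
            table.put(r.composeRowKey(), r);
        }
        return table;
    }
}
```

The `Comparable`-with-itself bound is what lets row keys of any chosen type keep a well-defined sort order, as HBase row keys must.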
- Changed `delete(HBRecord[])` to `delete(List<? extends HBRecord<R>> objects)` to avoid a conflict with another generic method with the same type erasure. (The conflict arose only in this version, since the row key is now generic.)
- Moved the `Util` methods that convert `String` to `ImmutableBytesWritable` (and vice versa) into the `HBObjectMapper` class (to keep serialization logic in sync with your codec).
- Removed the `writeValueAsRowKeyResultPair` methods. Alternatively, use the existing methods `getRowKey` and `writeValueAsResult`.
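The `Closeable` DAO change mentioned above boils down to the standard try-with-resources pattern. A minimal sketch, with `DaoSketch` and `Demo` as illustrative stand-ins (the real `AbstractHBDAO` closes the underlying HBase connection):

```java
// Illustrative sketch of the try-with-resources usage enabled by a Closeable DAO.
class DaoSketch implements AutoCloseable {
    static boolean closed = false;

    String get(String rowKey) {
        return "value-for-" + rowKey; // pretend read from HBase
    }

    @Override
    public void close() {
        closed = true; // the real implementation closes the underlying connection
    }
}

class Demo {
    static String fetch(String rowKey) {
        // The DAO is closed automatically when the try block exits,
        // even if an exception is thrown inside it.
        try (DaoSketch dao = new DaoSketch()) {
            return dao.get(rowKey);
        }
    }
}
```

This is why DAO instances no longer need an explicit cleanup call at every usage site.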