[java] When should we implement Serializable interface?

The answer is, perhaps surprisingly, never; or more realistically, only when you are forced to for interoperability with legacy code. This is the recommendation in Effective Java, 3rd Edition, by Joshua Bloch:

There is no reason to use Java serialization in any new system you write

Mark Reinhold, chief architect of the Java platform group at Oracle, is on record as saying that removing the current Java serialization mechanism is a long-term goal.


Why Java serialization is flawed

Java provides, as part of the language, a serialization scheme that you can opt in to by implementing the Serializable interface. This scheme, however, has several intractable flaws and should be treated as a failed experiment by the Java language designers.
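
For reference, opting in and round-tripping an object looks like the following minimal sketch; the Point class is an illustrative assumption, not code from the question:

```java
import java.io.*;

// The opt-in is the "implements Serializable" clause; nothing else is needed.
class Point implements Serializable {
    private static final long serialVersionUID = 1L;
    private final int x;
    private final int y;

    Point(int x, int y) {
        this.x = x;
        this.y = y;
    }
}

class RoundTrip {
    public static void main(String[] args) throws Exception {
        // Serialize to an in-memory byte stream...
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Point(1, 2));
        }
        // ...and read it back. Note that readObject builds the Point
        // without ever invoking its constructor.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Point p = (Point) in.readObject();
        }
    }
}
```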

  • It pretends that one can speak of the serialized form of an object. But there are arbitrarily many serialization schemes, and therefore arbitrarily many serialized forms. By imposing one scheme, with no way to change it, Java prevents applications from using the scheme most appropriate for them.
  • It acts as an additional means of constructing objects, one that bypasses any precondition checks your constructors or factory methods perform. Unless you write extra deserialization code to re-validate (which is tricky, error-prone, and difficult to test), your code probably has a gaping security weakness; see the sketch after this list.
  • Testing interoperability of different versions of the serialized form is very difficult.
  • Handling immutable objects is troublesome, because deserialization must set their final fields outside of any constructor.
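
To make the second flaw concrete, here is a minimal sketch, loosely modeled on the Period example from Effective Java; the class and its invariant are illustrative assumptions. Deserialization constructs the object without running the constructor, so the invariant has to be re-checked by hand in readObject:

```java
import java.io.*;

class Period implements Serializable {
    private static final long serialVersionUID = 1L;
    private final long start;
    private final long end;

    Period(long start, long end) {
        // The invariant every caller of the constructor is held to.
        if (start > end) {
            throw new IllegalArgumentException("start after end");
        }
        this.start = start;
        this.end = end;
    }

    // Deserialization bypasses the constructor entirely. Without this
    // extra method, an attacker who controls the byte stream can
    // materialize a Period whose start is after its end.
    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        if (start > end) {
            throw new InvalidObjectException("start after end");
        }
    }
}
```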

What to do instead

Instead, use a serialization scheme that you explicitly control, such as Protocol Buffers, JSON, XML, or your own custom scheme, as in the sketch below.
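
As one illustration, here is a hand-rolled scheme using only the JDK's DataOutputStream and DataInputStream; the Point class and its wire format are assumptions made up for this sketch:

```java
import java.io.*;

final class Point {
    final int x;
    final int y;

    Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    // You choose the wire format explicitly: here, two big-endian ints.
    void writeTo(DataOutputStream out) throws IOException {
        out.writeInt(x);
        out.writeInt(y);
    }

    // Reading goes through the ordinary constructor, so any precondition
    // checks it performs apply to deserialized instances as well.
    static Point readFrom(DataInputStream in) throws IOException {
        return new Point(in.readInt(), in.readInt());
    }
}
```

If the format needs to evolve, you can write an explicit version number first and branch on it when reading, which makes testing interoperability across versions a matter of keeping old sample byte streams around.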