This roadmap is still current, though progress has been gradual. Much of the work so far this year has gone into modernizing the Immutable.js codebase, though some incremental performance work is already happening. We haven't yet cut a new branch, which is why you're not seeing much activity on the v4.0 branch specifically.
The performance penalties you see vary widely depending on how you use it. In most of the places I've seen Immutable data structures used with high performance, there's a good understanding of which operations are fast and which are slow.
For example, iterating over an Immutable structure in a loop is slightly slower than iterating over its mutable brethren, about 15% slower by my measurements, because it crawls a tree rather than scanning a flat list.
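To make the tree-vs-list difference concrete, here's a toy sketch in plain JavaScript. It is not Immutable.js's actual node layout (the real implementation uses 32-way branching tries with more levels); it only illustrates why iteration must descend through nodes rather than scan one contiguous buffer:

```javascript
// Toy two-level trie: leaves hold small chunks of values.
// A real Immutable.List uses 32-way branching; 4 keeps the sketch readable.
const BRANCH = 4;

function treeFromArray(arr) {
  if (arr.length <= BRANCH) return { leaf: true, values: arr };
  const children = [];
  for (let i = 0; i < arr.length; i += BRANCH) {
    children.push({ leaf: true, values: arr.slice(i, i + BRANCH) });
  }
  return { leaf: false, children };
}

// Iterating the tree means visiting each node before reaching values,
// whereas an array sum is a single contiguous scan.
function sumTree(node) {
  if (node.leaf) return node.values.reduce((a, b) => a + b, 0);
  return node.children.reduce((acc, child) => acc + sumTree(child), 0);
}
```

The extra node hops are the source of the constant-factor overhead; both traversals are still O(N) overall.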
Creating and destroying data structures also depends a lot on which operations you're doing. David Nolen, in one of his presentations, illustrated how building an Immutable list by pushing in the numbers 1 to 1,000,000 could actually end up being faster than doing the same work with a plain array. Why? Because under the hood, an array is also creating and destroying memory buffer space, and the shape of the memory used by the Immutable data structures allowed for more reuse and ended up being slightly faster. Crazy!
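A minimal sketch of the "creating and destroying memory buffer space" an array does under the hood: a growable buffer that doubles its capacity and copies when full. This is a conceptual illustration, not how any particular JavaScript engine implements arrays:

```javascript
// Sketch of a growable array: when capacity runs out, allocate a
// larger buffer, copy everything over, and discard the old buffer.
class GrowableBuffer {
  constructor() {
    this.data = new Array(4); // small initial capacity
    this.length = 0;
  }
  push(x) {
    if (this.length === this.data.length) {
      const bigger = new Array(this.data.length * 2);
      for (let i = 0; i < this.length; i++) bigger[i] = this.data[i];
      this.data = bigger; // the old buffer becomes garbage
    }
    this.data[this.length++] = x;
  }
}
```

Every doubling copies the whole buffer and leaves the old one for the garbage collector, which is the churn a persistent structure's node reuse can sometimes avoid.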
However, there are some operations that aren't O(1) in Immutable, or are naturally significantly slower. Anything that results in re-indexing an Immutable list, for example, is O(N). Or the worst of them, toJS(), which performs a recursive deep copy of the entire data structure. If you want to do a performance audit, just look through your codebase for toJS() and try to remove those calls.
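To see why toJS() is the worst offender, here's a minimal sketch in plain JavaScript of what a recursive deep conversion has to do. This is not Immutable.js's implementation, just the shape of the work: every node gets visited and freshly allocated, so the cost is O(total entries) even if nothing changed since the last conversion:

```javascript
// Conceptual sketch of a deep copy like toJS(): walk the whole
// structure, allocating a new copy of every array and object.
function deepCopy(value) {
  if (Array.isArray(value)) {
    return value.map(deepCopy); // new array, recursively copied items
  }
  if (value !== null && typeof value === 'object') {
    const out = {};
    for (const key of Object.keys(value)) {
      out[key] = deepCopy(value[key]); // new object, recursively copied
    }
    return out;
  }
  return value; // primitives are returned as-is
}
```

Calling something like this on every render or every store read throws away all the structural sharing the persistent data structures worked to preserve.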
My issue is that I can't throw immutable.js into any code without incurring performance penalties. I'm trying to convince some of our dev teams to use immutable data structures.
At the end of the day, Immutable data structures are not designed to be faster than their mutable brethren in isolation. There will always be a performance overhead to using them, even after we make it through more of our 2016 roadmap of perf improvements. Instead, immutability enables new kinds of techniques for whole-program optimization, like memoization, that simply can't be done with mutable data.
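Memoization by reference equality is a good example of such a whole-program technique. With immutable inputs, an unchanged reference guarantees an unchanged value, so one strict-equality check can replace an expensive recomputation or deep comparison. A minimal sketch (the helper name is mine, not an Immutable.js API):

```javascript
// Memoize a single-argument function by reference. Safe only when
// arguments are immutable: same reference implies same value.
function memoizeByReference(fn) {
  let lastArg;
  let lastResult;
  let called = false;
  return function (arg) {
    if (called && arg === lastArg) {
      return lastResult; // cache hit via one === check
    }
    lastResult = fn(arg);
    lastArg = arg;
    called = true;
    return lastResult;
  };
}
```

This is the same idea behind shouldComponentUpdate checks in React: mutable data would make the `===` shortcut unsound, because the same reference could hold different contents over time.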
Unfortunately, you can't just throw these things into any code and expect speed gains; instead you need to design for the opportunities immutability presents, as React already does.
And serialization in server-side-rendering projects. What's the status here?
The idea is that if you're using Immutable.js on both the server and the client, or you want to stash it away in localStorage or something like that, then rather than converting to JSON and reparsing, we should have a more direct way to represent it. There's been some experimentation here, and some other great projects support this, like Transit. David Nolen wrote a great post about this a while ago (swannodette.github.io/2014/07/30/hijacking-json).