Just to throw another method onto the fire -- modern ECMAScript only:
var result = [];
for (var i of yourData) result = result.concat(i.ss);
I favor the for-loop approaches -- the reduce/map/forEach methodologies may look 'simpler', but in practice the overhead of the function callbacks (and of the functions themselves) can end up two to five times slower than brute-force looping. All the more so in modern ECMAScript, where for/of is now a (ridiculously overdue) addition to the language.
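For comparison, here's a sketch of the callback-based reduce version being replaced (the sample `yourData` below is a stand-in assumption for your actual array of objects with an `ss` property):

```javascript
// Hypothetical sample data standing in for the real input
var yourData = [{ ss: [1, 2] }, { ss: [3] }, { ss: [4, 5] }];

// reduce/concat version: one callback invocation and one freshly
// allocated intermediate array per element of yourData
var result = yourData.reduce(function (acc, item) {
  return acc.concat(item.ss);
}, []);
// result is [1, 2, 3, 4, 5]
```

Same output as the for/of loop, just with the extra callback machinery the loop avoids.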
Oh, and depending on the browser engine this next version may be faster -- and it uses less memory on all of them.
var result = [];
for (var i of yourData) result.push.apply(result, i.ss);
It looks strange, but it works. The assignment in "result = result.concat(...)" means allocating memory for the result of the concat, plus the overhead of releasing the old value of the result variable. A push just allocates pointer/space for the new elements on the existing array instead of a full allocate-and-release of the entire array. Faster in some browsers, slower in others, and less memory thrashing either way (though with potential for higher fragmentation).
If I were deploying for just V8 -- such as in an Electron or NW.js app -- I'd go with the latter. If building a website where Edge/Chakra is still the slowpoke in the room, I'd go with the former.