I've been learning about in-memory computing (both Oracle and SAP HANA), and it's looking like a major revolution, no? Making the architecture of the 1990s a relic of the past? Just trying to understand.
In this SAP presentation (first 5 minutes), Prof. Hasso Plattner talks about the revolution that is HANA. Because it's an in-memory database, performance is lightning-fast, which opens up many possibilities. Attached is the HANA .pdf.
After researching this a bit (including Oracle, here), I can see the value of in-memory computing: faster, reduces complexity, reduces data footprint, etc.
Prof. Plattner also talks about eliminating OLAP databases (and data warehouses, which I suppose are OLAP) because of the speed. So is this true too? He says to bring analysis back to the OLTP system. I also heard HANA can scan 2 billion records per second . . . if so, wow.
So I'm old-school Oracle: came in at the beginning of client-server, three-tier architecture, etc., with a little COBOL thrown in back in the day. I can write awesome SQL statements, did plenty of PL/SQL programming, and lots of performance tuning. But it's looking to me like this is a new world, and I'd better get a handle on it.
We have a small database, so size isn't even a consideration: about 60,000 main master records (students in a higher-education setting).
Overall, this seems to be a major change, one that upends old ways of doing things, at least according to Prof. Plattner.
Doesn't it seem that any shop worth its salt should be investigating this? And for the dinosaur development manager who says, "we're just fine, don't worry about this," what should I say? (This is the same development manager who spends 100% of his time patching a 20-year-old system, including many, many processes that pre-aggregate the data.)
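To make the pre-aggregation point concrete: the pitch for in-memory is that summaries a batch job would pre-compute overnight can instead be computed on demand at query time. Here's a minimal sketch using Python's built-in sqlite3 with an in-memory database (purely illustrative — SQLite is not HANA, and the `enrollment` table and its columns are made up for the example):

```python
import sqlite3

# SQLite's in-memory mode, used here only to illustrate the idea of
# querying data held in RAM rather than reading pre-aggregated tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Toy student-enrollment table, loosely modeled on a higher-ed master record.
cur.execute("CREATE TABLE enrollment (student_id INTEGER, term TEXT, credits INTEGER)")
cur.executemany(
    "INSERT INTO enrollment VALUES (?, ?, ?)",
    [(1, "2023F", 15), (2, "2023F", 12), (1, "2024S", 9), (3, "2024S", 15)],
)

# The aggregate is computed at query time -- no nightly pre-aggregation step,
# no separate summary table to keep in sync with the transactional data.
cur.execute(
    "SELECT term, COUNT(DISTINCT student_id), SUM(credits) "
    "FROM enrollment GROUP BY term ORDER BY term"
)
print(cur.fetchall())  # [('2023F', 2, 27), ('2024S', 2, 24)]
```

With 60,000 master records, a scan-and-aggregate like this over in-memory data is trivially fast, which is essentially the "bring analysis back to the OLTP system" argument at small scale.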
So I'm wondering:
- the extent to which in-memory is being implemented
- are OLAP / data warehouse systems going to be obsolete?
- other thoughts for an old dog . . .