By Cat Yong
For Teradata, the UDA is essentially a data architecture made up of three significant components: Hadoop, a big data discovery platform, and the enterprise data warehouse (EDW). The UDA organises data and manages user queries so that, to users, all the data appears to sit in one large pool.
Morrison described the three components: “Hadoop is good at storing large amounts of structured or unstructured data, very cheaply. It is not good at high-concurrency work. Aster is our transitory big data discovery platform that can access data from Hadoop, and it has a bunch of unique, big data analytics capabilities. Our EDW is built for high workloads, security and lots of users.”
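To make the “one large pool” idea concrete, the sketch below shows what a single federated query across the UDA layers might look like to an analyst. It is an illustrative assumption only: the connection stub, the server name hadoop_prod and the table names are hypothetical, and the SQL is not Teradata's or QueryGrid's actual syntax.

```python
# Illustrative sketch only: one query that spans the EDW and Hadoop-resident
# data, so the analyst sees a single "pool". Connection factory, server and
# table names are hypothetical, not Teradata's actual API.
from typing import Any


def get_connection() -> Any:
    """Return a DB-API 2.0 style connection to the analytics environment.

    Stubbed out here; a real deployment would plug in the vendor's driver.
    """
    raise NotImplementedError("plug in the vendor driver here")


FEDERATED_QUERY = """
SELECT c.customer_id,
       c.segment,
       w.clicks_last_30d
FROM   edw.customer_dim AS c            -- lives in the warehouse
JOIN   hadoop_prod.web_clickstream AS w -- surfaced from the Hadoop layer
       ON w.customer_id = c.customer_id
WHERE  c.segment = 'high_value'
"""


def run_federated_query() -> list:
    # To the user this is one SQL statement against one "pool"; the platform
    # decides which layer each table actually comes from.
    conn = get_connection()
    try:
        cur = conn.cursor()
        cur.execute(FEDERATED_QUERY)
        return cur.fetchall()
    finally:
        conn.close()
```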
The EDW also comes with service level agreements (SLAs), as it is the core component responsible for operationalising data. A marketing campaign run out of the EDW, for example, may need to be delivered to the relevant systems by certain times each day.
“Let’s say call centre systems need to get sales and marketing leads by 6am every day. If you miss that time window, it is too late. Because when people start calling in, it’s already too busy. So it puts a heavy load on the system, which means SLAs are incredibly important.
“Otherwise, you have a call centre sitting there, with nothing to do,” Morrison explained.
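As a rough illustration of the kind of SLA Morrison describes, the hypothetical sketch below checks whether a nightly lead extract finished before the 6am cut-off. The delivery function and job structure are invented for illustration and are not Teradata's scheduling tooling.

```python
# Minimal sketch, assuming a nightly batch job that must hand leads to the
# call-centre system before 06:00 local time, as in the example above.
from datetime import datetime, time
from typing import Optional

SLA_CUTOFF = time(hour=6)  # leads must be delivered before 6am


def deliver_leads(leads: list) -> None:
    """Push leads to the call-centre system (stubbed for the sketch)."""
    print(f"delivered {len(leads)} leads")


def run_nightly_lead_job(leads: list, now: Optional[datetime] = None) -> bool:
    """Deliver leads and report whether the 6am SLA was met."""
    deliver_leads(leads)
    finished_at = (now or datetime.now()).time()
    sla_met = finished_at < SLA_CUTOFF
    if not sla_met:
        # In a real operation this would alert the on-call team; a missed
        # window means agents start the day without fresh leads.
        print(f"SLA missed: finished at {finished_at}, cut-off {SLA_CUTOFF}")
    return sla_met
```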
Future-proofing logical data models
With so much data, and so many types of data, being generated, the architecture upon which a business organises its data has never been more important. Data models, logical and physical, are helping businesses do this, and have been for at least the past three decades, said Morrison.
Logical data models can be described as blueprints from which physical data models are built. One obvious example of logical and physical data models at work is Teradata's own Unified Data Architecture.
“For the banking industry, for example, we are up to version 13 of our logical data model. Basically, it is a design for how to organise data for a bank. And we have a council, comprising Teradata employees and customers, that meets regularly to work out what to add into the model next.”
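To show the distinction in miniature, the toy sketch below pairs a platform-neutral logical model (business entities and their relationships) with one possible physical realisation as DDL. The entities are generic banking examples invented for illustration and are not taken from Teradata's actual banking logical data model.

```python
# Toy illustration of logical vs. physical data models. Entity and column
# names are generic examples, not Teradata's banking model.
from dataclasses import dataclass


# --- logical model: business entities and relationships, platform-neutral ---
@dataclass
class Party:
    party_id: int
    full_name: str


@dataclass
class Account:
    account_id: int
    owner_party_id: int   # relationship: each account is owned by a Party
    balance: float


# --- physical model: how one platform might realise those entities as DDL ---
PHYSICAL_DDL = """
CREATE TABLE party (
    party_id   BIGINT PRIMARY KEY,
    full_name  VARCHAR(200) NOT NULL
);

CREATE TABLE account (
    account_id      BIGINT PRIMARY KEY,
    owner_party_id  BIGINT NOT NULL REFERENCES party(party_id),
    balance         DECIMAL(18, 2) NOT NULL
);
"""
```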
This is a continuous process, and the model may be updated years before an industry regulatory requirement actually comes into force, Morrison also claims.
“Whenever our customers use our data model, we know it can handle pretty much any data set, any query that the bank will have.”
“I haven’t come across a customer yet, in all of Asia Pacific, who has found a business application that our data model is not already prepared for,” said Morrison.