Different Stakeholders, Different Views: Why Database Management Requires a Systematic Approach

What does your company’s approach to database management look like? You should always have an answer to that question.

Suppose that, at the end of a conference call, someone from the executive leadership team asks you and your boss to stay on for a few minutes.

“I saw a TED talk last week about databases,” the exec says. “I learned that it’s important to have a systematic approach to database management. So, what’s ours?”

Fortunately, you’ll have an answer.

“I’m glad you asked,” you say. “Let me walk you through it.”

Database management: different stakeholders, different needs

If data is the new oil, then you’d better take good care of yours and keep an eye on the latest DBMS trends.

At a minimum, database management means covering the basics for database professionals like yourself: you store the data safely and make sure it’s available when the business needs it. In other words, you keep the lights on.

But as databases grow in size and in value, database management goes beyond that. It means helping the people who develop new applications on top of your company’s data, and it means avoiding performance problems like slow queries when analysts and business managers across the organization are trying to study that data.

How do you manage databases when the needs are so different?

For developers

“First,” you tell the exec, “we look at the needs of our database developers. From their perspective, database management is all about designing, developing and maintaining the data structures and functional code that form part of our applications. We run those applications against the database to maximize business advantage, drive revenue and remain competitive.”

Developing databases

Developers pride themselves on writing efficient, high-quality code that will perform well in production. They spend a lot of time designing and modeling efficient data structures, thinking as far down the road as possible so they don’t paint themselves into a corner. They value tools that automate their work and allow them to share code, scripts, artifacts and standards with other developers. They also need an effective way to optimize SQL executions in the database to ensure applications are highly responsive.
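As a rough illustration of what tuning an SQL execution involves, the sketch below uses SQLite (purely as a stand-in for any relational engine) to compare a query plan before and after adding an index; the orders table and its columns are invented for the example.

```python
import sqlite3

# Minimal sketch: inspect a query plan before and after adding an index.
# SQLite stands in for any relational engine; all names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(10_000)])

query = "SELECT total FROM orders WHERE customer_id = ?"

# Before: SQLite reports a full table scan.
for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
    print("before:", row)

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# After: the plan switches to an index search.
for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
    print("after:", row)
```

The same before-and-after discipline applies on Oracle, SQL Server or PostgreSQL, each of which exposes its own plan-inspection command.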

What sets their hair on fire is the variety of databases they work on. They work with commercial relational databases such as SQL Server, Oracle and SAP; open source relational databases like MySQL and PostgreSQL; and NoSQL databases such as Cassandra and MongoDB. Mind you, they like challenges and novelty, but they don’t like them at quarter to five on a Friday afternoon when something isn’t working right. They want tools that solve their problems on multiple platforms.

For them, database management also covers the way they test and review their code. That includes automated unit testing and code reviews so they know their code is functionally correct, debugged, reviewed and validated.
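Here is a minimal sketch of what an automated unit test for database code can look like, using Python’s unittest module and a seeded in-memory SQLite database; the top_customers function and the sales table are invented for the example.

```python
import sqlite3
import unittest

def top_customers(conn, limit=3):
    """Example database code under test: highest-spending customers."""
    return conn.execute(
        "SELECT customer, SUM(amount) AS spend FROM sales "
        "GROUP BY customer ORDER BY spend DESC LIMIT ?", (limit,)
    ).fetchall()

class TopCustomersTest(unittest.TestCase):
    def setUp(self):
        # Each test gets a fresh, seeded in-memory database.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE sales (customer TEXT, amount REAL)")
        self.conn.executemany("INSERT INTO sales VALUES (?, ?)",
                              [("acme", 100.0), ("acme", 50.0), ("globex", 120.0)])

    def test_ranks_customers_by_total_spend(self):
        result = top_customers(self.conn, limit=2)
        self.assertEqual(result, [("acme", 150.0), ("globex", 120.0)])

if __name__ == "__main__":
    unittest.main()
```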

Database DevOps CI/CD

Database development is different from application development in one important way: you can’t just undo a change and go back to how things were before. Why? Because the database contains not only schemas and code you may wish to undo, but also a continuous stream of data transactions that are difficult to roll back to a particular point in time. So, while application developers can automate their changes to the nth degree, database developers make most of their changes manually, through carefully orchestrated scripts that DBAs deploy into production.

Here, the goal is to give database developers the same access to automated processes that application developers have. That means automated database code testing, analysis and validation within existing tools for continuous integration and continuous delivery.

Database DevOps CI/CD brings change management into parity for both application and database developers. The result is a smoother DevOps pipeline that enables rapid delivery of changes and enables the entire business to be more agile.
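To make the scripted, forward-only approach described above concrete, here is a minimal sketch of a migration runner of the kind a CI/CD step might invoke. The migrations/ directory of numbered .sql scripts and the schema_version table are assumptions for the example, not any particular tool’s convention.

```python
import sqlite3
from pathlib import Path

# Minimal sketch of a forward-only migration runner. Each script is applied
# exactly once, in order, and recorded; changes only ever roll forward.
def apply_migrations(conn, migrations_dir="migrations"):
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (script TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT script FROM schema_version")}
    for script in sorted(Path(migrations_dir).glob("*.sql")):
        if script.name in applied:
            continue  # already deployed on this database
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO schema_version (script) VALUES (?)", (script.name,))
        conn.commit()
        print("applied", script.name)

if __name__ == "__main__":
    apply_migrations(sqlite3.connect("app.db"))
```

Because the runner is idempotent, a CI/CD pipeline can call it on every build and only the new scripts take effect.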

For business analysts

“Next,” you say, “database management extends to those users who need to extract useful insights from various data sources so the business can make informed decisions.”

Preparing data

Data tends to go stale pretty quickly, so part of good database management is getting it into the models and dashboards of analysts as soon as possible. But there are four main roadblocks to data preparation:

  • The wide variety of data sources covers both structured and unstructured data in platforms as diverse as Oracle, SAP, MySQL and Cassandra.
  • To pull the data, analysts may use everything from custom, vendor-supplied tools to ordinary SQL queries.
  • In regulated industries, data governance and compliance impose limits on what analysts can access.
  • Because they have to do most of the data preparation work, analysts spend too much time accessing and combining data, and not enough time analyzing and understanding what the data is saying.

Generally, analysts would rather study data than prepare it. Smart database professionals try to shorten the time to usefulness by automating tasks like cross-connection querying, data quality assessment, data profile analysis, data manipulation and reporting.
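As a sketch of what automating cross-connection querying can look like, the example below pulls rows from two separate connections and combines them in the client; two in-memory SQLite databases stand in for sources as different as Oracle and Cassandra, and the table names are invented.

```python
import sqlite3

# Two in-memory databases stand in for two separate data sources.
sales = sqlite3.connect(":memory:")
sales.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
sales.executemany("INSERT INTO orders VALUES (?, ?)",
                  [(1, 250.0), (1, 100.0), (2, 75.0)])

crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, region TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Acme", "EMEA"), (2, "Globex", "APAC")])

# Join across connections in the client, since no single engine sees both.
revenue = dict(sales.execute(
    "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"))
for cust_id, name, region in crm.execute("SELECT id, name, region FROM customers"):
    print(f"{name} ({region}): {revenue.get(cust_id, 0.0):.2f}")
```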

For database administrators

“Then,” you tell the exec, “we think about the nuts-and-bolts work of database management from the perspective of our DBAs.”

Monitoring database performance

Poor database performance affects application users directly, and when that happens it hardly matters how good the data is. Administrators are responsible for uptime, availability and reliability, which means they have to continually monitor the database to catch small problems before they lead to large outages.

Here, too, cross-platform tools are useful as companies diversify their database landscape — including SQL Server, Oracle, SAP, MongoDB and Cassandra — to take advantage of relative strengths and avoid vendor lock-in. The goals of database monitoring are to obtain a complete picture of the environment, whether on-premises or in the cloud, and to find and fix performance problems early.

Underpinning the databases you’ll find the operating system — Windows or Linux — and virtual machines, two more layers where problems may hide. When top-to-bottom performance monitoring is automated, database professionals can spend more of their time on high-value tasks like analyzing SQL and tuning queries.
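As one small example of what automated monitoring can check, the sketch below flags long-running sessions on a PostgreSQL instance via its pg_stat_activity view; it assumes the psycopg2 driver, and the connection string and threshold are placeholders.

```python
import psycopg2  # assumes a PostgreSQL target and the psycopg2 driver

# Minimal monitoring sketch: flag sessions whose current query has been
# running longer than a threshold. The connection string is a placeholder.
conn = psycopg2.connect("dbname=appdb user=monitor")
with conn.cursor() as cur:
    cur.execute("""
        SELECT pid, usename, now() - query_start AS runtime, query
        FROM pg_stat_activity
        WHERE state = 'active'
          AND now() - query_start > interval '30 seconds'
    """)
    for pid, user, runtime, query in cur.fetchall():
        print(f"slow session {pid} ({user}), running {runtime}: {query[:80]}")
```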

Administering databases

There’s nothing glamorous about this. Databases need reorganizing, indexes need maintaining and storage needs reallocating. The goal here is to give the admins what they need to manage the day-to-day stuff so that the day-to-day stuff doesn’t manage the admins. The fewer and more versatile the tools they can use, the better for their productivity and effectiveness across database platforms as wide-ranging as Oracle, SQL Server, MySQL and PostgreSQL.

Routine admin tasks take up a lot of valuable time that database professionals cannot spare, especially with so many databases to manage. Task automation, such as for script executions and health checks, helps DBAs work more efficiently and reduces the mistakes associated with human error.
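A health check of this kind can be as simple as a scheduled script that runs a list of SQL probes and reports any failures. The sketch below shows the shape of such a runner; the checks, table names and thresholds are invented for illustration.

```python
import sqlite3

# Minimal sketch of an automated health check: run a list of SQL probes
# and report any that fail. A real runner would be scheduled (cron, a job
# agent) and alert on failures rather than just printing.
CHECKS = [
    ("orders table not empty", "SELECT COUNT(*) FROM orders", lambda n: n > 0),
    ("no orphaned line items",
     "SELECT COUNT(*) FROM line_items WHERE order_id NOT IN (SELECT id FROM orders)",
     lambda n: n == 0),
]

def run_health_checks(conn):
    for name, sql, ok in CHECKS:
        value = conn.execute(sql).fetchone()[0]
        status = "PASS" if ok(value) else "FAIL"
        print(f"{status}: {name} (value={value})")

if __name__ == "__main__":
    run_health_checks(sqlite3.connect("app.db"))
```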

Databases also need protection, preferably with minimal effort and risk. That means backup and recovery software capable of restoring database access in minutes without cramping backup windows or affecting user productivity.
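As a small-scale illustration of online backup, the sketch below uses SQLite’s backup API, which copies a live database without blocking readers; enterprise backup tooling works at a very different scale, but the principle of copying while staying available is the same.

```python
import sqlite3

# Minimal online-backup sketch: copy a live database in small increments
# so readers and writers on the source stay unblocked.
source = sqlite3.connect("app.db")
target = sqlite3.connect("app-backup.db")
source.backup(target, pages=100)  # copy 100 pages at a time
target.close()
source.close()
print("backup complete")
```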

Replicating databases

Another way to get the most value out of databases is to make them accessible in multiple places at once through replication.

Replication benefits IT by making production databases highly available and recoverable, whether they are on-premises or in the cloud. It allows database transactions to continue even during maintenance and enables the distribution of database workloads across separate instances.

Business users benefit when they can run compute-intensive queries and reports against the replicated database without slowing the production database. Replication software frees up database resources for paying customers and front-line users while keeping the data in sync in multiple places.
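A minimal sketch of that read-routing idea appears below: reporting workloads are directed to a replica while the primary keeps serving transactions. The connection targets are placeholders; in production they would be a primary and a replication-fed standby.

```python
import sqlite3

# Placeholder connections; in production these would point to a primary
# and a replica kept in sync by replication software.
primary = sqlite3.connect("primary.db")
replica = sqlite3.connect("replica.db")

def connection_for(workload):
    # Compute-heavy reporting goes to the replica; everything else hits
    # the primary, which stays free for transactions.
    return replica if workload == "report" else primary

conn = connection_for("report")
for row in conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(row)
```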

Migrating databases

Real digital transformation takes almost every company into cloud computing sooner or later. It’s a chance to move from the CapEx world of buying and maintaining infrastructure to the OpEx world of leasing space from public cloud providers. For DBAs, migrating to almost boundless cloud resources promises minimal IT effort and a relatively painless transition.

Painless, that is, if everything goes well. One of the biggest questions surrounding migration to the cloud is “How do we know we’re selecting the best fit in a cloud provider and service level?” DBAs spend a lot of time estimating costs, dependencies, availability and capacity before flipping the switch and migrating workloads to the cloud. Still, it’s hard to avoid the shock of unexpected costs when the first bill arrives.

Another big question is “How can we reduce the risk of downtime and data loss during the migration?” There is almost no way to eliminate risk completely, but the right tools can mitigate it by testing and monitoring performance at each step on the migration path.
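One simple form of that testing is to time a fixed set of representative queries against the source and the migrated copy and compare the runs. The sketch below shows the idea; the queries and database files are placeholders.

```python
import sqlite3
import time

# Placeholder benchmark: a fixed set of representative queries, timed
# against the on-premises source and the migrated copy.
BENCHMARK = [
    "SELECT COUNT(*) FROM orders",
    "SELECT region, SUM(amount) FROM orders GROUP BY region",
]

def time_queries(conn):
    results = {}
    for sql in BENCHMARK:
        start = time.perf_counter()
        conn.execute(sql).fetchall()
        results[sql] = time.perf_counter() - start
    return results

baseline = time_queries(sqlite3.connect("on_prem.db"))
migrated = time_queries(sqlite3.connect("cloud_copy.db"))
for sql in BENCHMARK:
    print(f"{sql}: {baseline[sql]:.4f}s -> {migrated[sql]:.4f}s")
```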

For the company in general

“Finally,” you tell the exec, “we think about database management in the context of the company as a whole.”

Protecting personal information

The changing regulatory environment has made companies responsible for data privacy. That means going beyond protecting databases from unauthorized access; it also means finding personally identifiable information (PII) across multiple databases, then protecting that information to comply with data privacy regulations.

Few medium-sized and large businesses can do that manually, because they run so many applications and databases. Database developers and database administrators are well placed to find the information, but it makes no sense to have them pore over the column names of thousands of tables.

Automated tools sample data across all tables, using a range of expressions and rules to find sensitive data. They notify database professionals and prompt them to apply redaction and encryption. For ongoing protection, they continually audit the databases for changes that introduce new kinds of PII, reducing risk and keeping the company compliant.
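The sketch below shows the core of such a scan in miniature: sample rows from every table and test text values against simple patterns. Real discovery tools use far richer rule sets; the patterns and sample size here are illustrative only.

```python
import re
import sqlite3

# Minimal PII-discovery sketch: sample rows from every table and flag
# text values matching simple patterns. Patterns are illustrative only.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_pii(conn, sample_size=100):
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        cur = conn.execute(f"SELECT * FROM {table} LIMIT {sample_size}")
        columns = [d[0] for d in cur.description]
        for row in cur:
            for col, value in zip(columns, row):
                if not isinstance(value, str):
                    continue
                for label, pattern in PATTERNS.items():
                    if pattern.search(value):
                        print(f"possible {label} in {table}.{col}")

if __name__ == "__main__":
    scan_for_pii(sqlite3.connect("app.db"))
```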

What’s your systematic approach to database management?

“Okay,” says the exec. “I can see you’ve thought it through. Let me know what you need from me.”

How does your company approach database management? Do you break it down by function? By stakeholder? Or do you fly by the seat of your pants? How do you explain it all to your execs?

No matter what your approach entails, you’ll want to have answers.

About the Author

John Pocknell

John Pocknell is a senior market strategist at Quest Software and part of the Information Management business unit. Based at the European headquarters in the U.K., John is responsible for synthesizing analyst data and customer interviews to create and evangelize solutions-based stories and messaging that relate to major IT initiatives across Quest’s extensive portfolio of database products worldwide. He has been with Quest Software since 2000, working in the database design, development and deployment product areas, and spent over 10 years as product manager for the Toad product line. John has been evangelizing Toad and other database solutions at conferences and user groups around the world for the last 19 years, as well as writing blogs and technical papers both internally and for the media. He has worked in IT for more than 30 years, most of that time in Oracle application design and development. He is a qualified aeronautical engineer with more than 10 years of experience in provisioning IT consultancy services and implementing quality assurance systems to ISO 9001.