
Oracle University Podcast

Oracle Corporation

159 episodes

  • Introduction to Oracle AI Vector Search

    24/03/2026 | 15 mins.
    Explore Oracle AI Vector Search and learn how to find data by meaning, not just keywords, using powerful vector embeddings within Oracle Database 23ai. In this episode, hosts Lois Houston and Nikita Abraham, along with Senior Principal APEX & Apps Dev Instructor Brent Dayley, break down how similarity search works, the new VECTOR data type, and practical steps for implementing secure, AI-powered search across both structured and unstructured data.
     
    Oracle AI Vector Search Fundamentals: https://mylearn.oracle.com/ou/course/oracle-ai-vector-search-fundamentals/140188/
    Oracle University Learning Community: https://education.oracle.com/ou-community
    LinkedIn: https://www.linkedin.com/showcase/oracle-university/
    X: https://x.com/Oracle_Edu
     
    Special thanks to Arijit Ghosh, Anna Hulkower, Kris-Ann Nansen, and the OU Studio Team for helping us create this episode.
     
    ----------------------------------------------------

    Episode Transcript:
     
    00:00
    Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started!
    00:26
    Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Communications and Adoption Programs with Customer Success Services, and with me is Nikita Abraham, Team Lead: Editorial Services with Oracle University.
    Nikita: Hi everyone! Today, we're beginning a brand-new season, this time on Oracle AI Vector Search. Whether you're new to vector searches or you've already been experimenting with AI and data, this episode will help you understand why Oracle's approach is such a game-changer.
    Lois: To make sure we're all starting from the same place, here's a quick overview. Oracle AI Vector Search lets you go beyond traditional database searches. Not only can you find data based on specific attribute values or keywords, but you can also search by meaning, using the semantics of your data, which opens up a whole new world of possibilities.
    01:20
    Nikita: That's right, Lois. And guiding us through this episode is Senior Principal APEX & Apps Dev Instructor Brent Dayley. Hi Brent! What's unique about Oracle's approach to vector search? What are the big benefits?
    Brent: Now one of the biggest benefits of Oracle AI Vector Search is that semantic search on unstructured data can be combined with relational search on business data, all in one single system. This is very powerful, and also a lot more effective because you don't need to add a specialized vector database. And this eliminates the pain of data fragmentation between multiple systems. 
    It also supports Retrieval Augmented Generation, also known as RAG. Now this is a breakthrough generative AI technique that combines large language models and private business data. And this allows you to deliver responses to natural language questions. RAG provides higher accuracy and avoids having to expose private data by including it in the large language model training data. 
    02:41
    Lois: OK, and can you explain what the new VECTOR data type is?
    Brent: So, this data type was introduced in Oracle Database 23ai. And it allows you to store vector embeddings alongside other business data. 
    Now, the VECTOR data type provides the foundation for storing vector embeddings. It lets you keep the embeddings of your unstructured data in the database alongside your business data and use both in your queries. So it allows you to apply semantic queries to your business data. 
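As a rough sketch of the VECTOR data type Brent describes (the table and column names here are illustrative, not from the episode), a vector column in Oracle Database 23ai can be declared right next to ordinary business columns:

```sql
-- Illustrative only: a products table that keeps business data and
-- a vector embedding of each product description side by side.
CREATE TABLE products (
  product_id      NUMBER PRIMARY KEY,
  product_name    VARCHAR2(200),
  list_price      NUMBER(10,2),
  description     CLOB,
  -- 384 dimensions of 32-bit floats; the dimension count depends on
  -- the embedding model you choose.
  description_vec VECTOR(384, FLOAT32)
);
```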
    03:24
    Lois: For many of our listeners, "vector embeddings" might be a new term. Can you explain what vector embeddings are?
    Brent: Vector embeddings are mathematical representations of data points. They capture the meaning and context of your unstructured data in numeric form. 
    You generate vector embeddings from your unstructured data either outside or within the Oracle Database. To do so, you can either use ONNX embedding machine learning models or access third-party REST APIs. 
    Embeddings can be used to represent almost any type of data, including text, audio, or visual data such as pictures. And they are used in proximity searches. 
    04:19
    Nikita: Now, searching with these embeddings isn't about looking for exact matches like traditional search, right? This is more about meaning and similarity, even when the words or images differ? Brent, how does similarity search work in this context?
    Brent: Vector data tends to be unevenly distributed and clustered into groups that are semantically related. Doing a similarity search based on a given query vector is equivalent to retrieving the k nearest vectors to your query vector in your vector space. 
    What this means is that you need to produce an ordered list of vectors by ranking them, where the first row is the closest or most similar vector to the query vector, the second row is the second closest, and so on, depending on your data set. What really matters is the relative order of distances rather than the actual distance values. 
    Now, similarity searches tend to get data from one or more clusters, depending on the value of the query vector and the fetch size. Approximate searches using vector indexes can limit the searches to specific clusters. Exact searches visit vectors across all clusters. 
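The exact-versus-approximate distinction Brent draws maps to SQL along these lines (a sketch; table, column, and bind-variable names are illustrative, while VECTOR_DISTANCE and the APPROX fetch clause are Oracle Database 23ai features):

```sql
-- Exact search: ranks every row by its distance to the query vector,
-- visiting vectors across all clusters.
SELECT product_id, product_name
FROM   products
ORDER  BY VECTOR_DISTANCE(description_vec, :query_vec, COSINE)
FETCH  FIRST 5 ROWS ONLY;

-- Approximate search: lets a vector index limit the scan to the
-- most promising clusters, trading a little accuracy for speed.
SELECT product_id, product_name
FROM   products
ORDER  BY VECTOR_DISTANCE(description_vec, :query_vec, COSINE)
FETCH  APPROX FIRST 5 ROWS ONLY;
```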
    05:51
    Lois: Let's talk about how we actually convert information into these vectors. There are models behind the scenes, right? Kind of like translators between words, images, and numbers. Brent, what embedding models does Oracle support, and how do they handle different data types?
    Brent: Vector embedding models let you assign meaning to a word, a sentence, the pixels in an image, or perhaps audio. In practice, that means quantifying the features, or dimensions, of your data. 
    Most modern vector embeddings use a transformer model. Bear in mind that convolutional neural networks can also be used. Depending on the type of your data, you can use different pretrained open-source models to create vector embeddings. As an example, for textual data, sentence transformers can transform words, sentences, or paragraphs into vector embeddings. 
    For visual data, you can use a residual network, also known as ResNet, to generate vector embeddings. For audio data, you can use a visual spectrogram representation, which lets you treat the audio case like the visual one. These models can also be trained on your own data set. Each model also determines the number of dimensions for your vectors. 
    As an example, Cohere's embedding model embed-english-v3.0 has 1,024 dimensions, while OpenAI's embedding model text-embedding-3-large has 3,072 dimensions. 
    07:45
    Nikita: For organizations ready to put this into practice, there's the question of how to get the models up and running inside Oracle Database. Can you walk us through how these models are brought into Oracle Database?
    Brent: Although you can generate vector embeddings outside the Oracle Database using pre-trained open-source embedding models or your own embedding models, you also have the option of doing so within the Oracle Database. To generate them within the Oracle Database, you need to use models that are compatible with the Open Neural Network Exchange standard, or ONNX (pronounced "onn-ex"). 
    Oracle Database implements an ONNX runtime directly within the database, and this is going to allow you to generate vector embeddings directly inside the Oracle Database using SQL. 
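In sketch form, loading an ONNX model and then generating an embedding with plain SQL might look like this (the directory, file, and model names below are made up for illustration; DBMS_VECTOR and VECTOR_EMBEDDING are Oracle Database 23ai features):

```sql
-- Load a pre-trained ONNX embedding model from a database directory.
BEGIN
  DBMS_VECTOR.LOAD_ONNX_MODEL(
    directory  => 'MODEL_DIR',
    file_name  => 'my_embedding_model.onnx',
    model_name => 'my_embedding_model');
END;
/

-- Generate a vector embedding from text directly inside the database.
SELECT VECTOR_EMBEDDING(my_embedding_model USING 'cloud databases' AS data)
FROM   dual;
```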
    08:41
    AI is transforming every industry. So, it's no wonder that AI skills are the most sought-after by employers. If you're ready to dive into AI, check out the OCI AI Foundations training and certification that's available for free! It's the perfect starting point to build your AI knowledge. Head over to mylearn.oracle.com to kickstart your AI journey today!
    09:06
    Nikita: Welcome back! Let's make this practical. Imagine I'm setting this up for the first time. What are the big steps? Can you walk us through the end-to-end workflow using Oracle AI Vector Search?
    Brent: The first step is to generate vector embeddings from your data, either outside the database or within the database. Embeddings are a mathematical representation of what your data means. So, what does this long sentence mean, for instance? What are the main keywords in it?
    You can generate embeddings not only on your typical string type of data, but also on other types of data, such as pictures or audio waveforms. 
    You might convert text strings directly to embeddings, or convert files into text, chunk that text into smaller pieces, and then generate embeddings on those chunks. You can also convert whole files to embeddings, or use embeddings for end-to-end search. 
    Now you have to generate vector embeddings from your unstructured data, as we mentioned, either outside or within the Oracle Database. You can either use the ONNX embedding machine learning models or you can access third-party REST APIs. 
    You can import pretrained models in ONNX format for vector generation within the database. You can download pretrained embedding machine learning models, convert them into the ONNX format if they are not already in that format. Then you can import those models into the Oracle Database and generate vector embeddings from your data within the database. 
    Oracle also allows you to convert pre-trained models to the ONNX format using Oracle Machine Learning for Python. This enables the use of text transformers from different companies. 
    11:36
    Nikita: Once those embeddings are generated, what's the next step? 
    Brent: Store the vector embeddings. You can create one or more columns of the VECTOR data type in your standard relational data tables, or store the embeddings in secondary tables that are related to the primary tables through primary key-foreign key relationships. 
    Either way, you store the resulting vector embeddings and the associated unstructured data together with your relational business data inside the Oracle Database. 
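The secondary-table pattern Brent mentions could be sketched like this (all names illustrative; the 4-dimensional vector literal is truncated for readability, since real embeddings typically have hundreds or thousands of dimensions):

```sql
-- A secondary table holding embeddings, linked to the business table
-- by a foreign key.
CREATE TABLE product_vectors (
  product_id      NUMBER REFERENCES products (product_id),
  description_vec VECTOR(4, FLOAT32)
);

-- Vectors can be inserted as bracketed literals.
INSERT INTO product_vectors (product_id, description_vec)
VALUES (101, '[0.12, -0.58, 0.33, 0.97]');
```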
    12:17
    Lois: And when do vector indexes come into play? 
    Brent: Now you may want to create vector indexes in the event that you have huge vector spaces. This is an optional step, but this is beneficial for running similarity searches over those huge vector spaces. 
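A vector index of the kind Brent describes might be created along these lines (a sketch with illustrative names; ORGANIZATION NEIGHBOR PARTITIONS requests an IVF-style index that groups vectors into clusters so approximate searches can skip most of the vector space):

```sql
-- Optional, but beneficial for similarity searches over huge
-- vector spaces.
CREATE VECTOR INDEX products_vec_idx
  ON products (description_vec)
  ORGANIZATION NEIGHBOR PARTITIONS
  DISTANCE COSINE;
```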
    12:38
    Nikita: Now, once all of that is in place, how do users perform similarity searches? 
    Brent: So once you have generated the vector embeddings and stored those vector embeddings and possibly created the vector indexes, you can then query your data with similarity searches. This allows for native SQL operations and allows you to combine similarity searches with relational searches in order to retrieve relevant data. 
    So let's take a look at the combined complete workflow. Step number one, generate the vector embeddings from your unstructured data. Step number two, store the vector embeddings. Step number three, create vector indexes. And step number four, combine similarity and keyword searches. 
    Now there is another optional step. You could generate a prompt and send it to a large language model for a full RAG inference. You can use the similarity search results to generate a prompt and send it to your generative large language model in order to complete your RAG pipeline. 
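Step four of the workflow, combining similarity and relational searches in one native SQL statement, might look like this sketch (names are illustrative; the WHERE clause narrows by business data before the vector distance ranks the remaining rows):

```sql
SELECT product_id, product_name, list_price
FROM   products
WHERE  list_price < 100
ORDER  BY VECTOR_DISTANCE(description_vec, :query_vec, COSINE)
FETCH  APPROX FIRST 10 ROWS ONLY;
```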
    14:07
    Lois: Thanks for that detailed walk-through, Brent. To sum up, today we introduced Oracle AI Vector Search, discussed its core concepts, data types, embedding models, and the complete workflow you'll use to get real value out of your business data, securely and efficiently. 
    Nikita: If you want to learn more about the topics we discussed today, go to mylearn.oracle.com and search for the Oracle AI Vector Search Fundamentals course. And if you're feeling inspired to try this out for yourself, don't forget to check out the Oracle Database 23ai SQL Workshop for hands-on training. Until next time, this is Nikita Abraham…
    Lois: And Lois Houston, signing off!
    14:49
    That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
  • Exploring the Oracle Analytics AI Assistant

    17/03/2026 | 17 mins.
    Join hosts Lois Houston and Nikita Abraham for a special episode of the Oracle University Podcast as they explore the Oracle Analytics AI Assistant. In this episode, you'll discover how Oracle's AI-powered conversational tool empowers users of all backgrounds to interact with business data using simple, natural-language questions. Learn how the assistant interprets queries, surfaces visualizations, and delivers actionable insights in seconds, all within Oracle's secure analytics environment. The episode dives into best practices for data preparation, security and privacy safeguards, how to configure datasets for optimal AI performance, and tips for getting the most relevant results. You'll also hear how synonyms, column indexing, and user permissions make analytics more accessible and accurate.
     
    Visualize Data with the Oracle Analytics AI Assistant: https://mylearn.oracle.com/ou/article-course/visualize-data-with-the-oracle-analytics-ai-assistant/156941/
    Oracle University Learning Community: https://education.oracle.com/ou-community
    LinkedIn: https://www.linkedin.com/showcase/oracle-university/
    X: https://x.com/Oracle_Edu
     
    Special thanks to Arijit Ghosh, Anna Hulkower, Kris-Ann Nansen, and the OU Studio Team for helping us create this episode.
    -------------------------------------------------------
     
    Episode Transcript:
     
    00:00
    Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started!
    00:26
    Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Communications and Adoption Programs with Customer Success Services, and with me is Nikita Abraham, Team Lead: Editorial Services with Oracle University.
    Nikita: Hi everyone! Today's episode is on the Oracle Analytics AI Assistant, which is all about making business data accessible and useful, no matter your background. Whether you're a seasoned pro or just starting out with Oracle Analytics, you'll want to stick around for this episode because we're covering everything you need to know to unlock powerful, intuitive, and secure data insights.
    01:06
    Lois: That's right. And full disclosure before we start. We're trying something a little different for this episode. Instead of a live guest, our expert will be an AI-generated voice sharing insights drawn directly from Oracle's official course materials. Think of it as getting a taste of what our training courses are like, with a little help from AI. So, with that, let's kick things off by taking a closer look at what the Oracle Analytics AI Assistant really is.
    Expert: The Oracle Analytics AI Assistant is an AI-powered tool that provides a conversational interface for data analysis. With this tool, data exploration becomes more intuitive and efficient, helping you access fast, personalized insights. The AI Assistant makes use of Generative AI to process queries, analyze indexed datasets, and create or refine relevant visualizations. It is fully integrated into the Oracle Analytics platform, complementing existing analytic and visualization capabilities.
    02:13
    Nikita: So, put simply, users have the ability to interact with their data in plain English and receive immediate, visual answers.
    Expert: Exactly! You can ask natural language questions, such as, "What were my sales in the United States last Tuesday?" or "Show me monthly sales for this year," and the assistant interprets the question, queries the right data, and generates the best visualization.
    02:39
    Lois: Before we dive deeper, let's ground ourselves in some of the core concepts behind this technology. Here's an overview of the AI technologies powering the assistant.
    Expert: 
    - Artificial Intelligence refers to systems or machines that perform tasks which typically require human intelligence, like reasoning, learning, perception, and language understanding.
    - Large Language Models or LLMs are AI programs trained on very large data sets. LLMs can generate human-like language and perform complex language tasks, such as writing emails or answering questions.
    - Generative AI is a branch of AI that can create new content, such as text, images, and audio. GenAI includes chatbots and virtual assistants capable of human-like conversations, answering questions, and creating content based on user prompts.
    - Natural Language Processing or NLP is a subfield of AI, targeting how computers understand and generate human language.
    03:42
    Lois: Now, let's look at what happens behind the scenes when someone interacts with the Oracle Analytics AI Assistant.
    Expert: Here is how the process works. You ask a question or make a request in natural language. Oracle Analytics Cloud identifies the most relevant dataset to answer that question, looking at metadata and attribute values. The platform prepares a prompt for the LLM that includes dataset metadata, column names, synonyms, and your question. The LLM and Natural Language Understanding interpret the question, and then translate it into a structured query. Oracle Analytics validates this query against your data model, and then queries your database.
    Based on the results, the AI Assistant creates the most appropriate visualization, like a chart, table, or similar format, and provides additional natural language insights.
    04:36
    Nikita: Security and privacy are top priorities for organizations using tools like this, so let's get into Oracle's approach to protecting user data.
    Expert: At Oracle, your data privacy and security are always top priorities. Specifically, your data is never shared with external model providers or other customers. Pre-trained generative AI models are accessed exclusively within Oracle's secure cloud infrastructure. No customer data is stored or retained by the AI models after processing, and prompt data is not used to train the models. And finally, all data processed is fully isolated and never combined or visible to anyone outside your organization.
    05:20
    Lois: In other words, users always remain in full control of their own data, with no risk of leakage or exposure to outside parties.
    Nikita: Yeah, this kind of reassurance is absolutely critical for enterprises.
    05:32
    Lois: That's right, Niki. Next, let's cover how to get the most accurate and relevant insights from the AI Assistant by following some best practices for prompting.
    Expert: To get the best answers:
    - Be specific. Include key data points, timeframes, or filters. For example: "Show total sales by country for Q2 2024."
    - Keep questions focused, clear, and concise.
    - Refine your request as needed. If you want different details or a simpler trend line, follow up with something like, "Show by quarter," or "Replace product category with customer segment."
    - Avoid complex prompts, like highly nested or multi-step ones. Ask a series of concise questions instead.
    - When typing column names or field values, pause briefly to let the Assistant suggest the correct field. This increases prompt accuracy.
    - Consider the context of the conversation. Filters and refinements made in previous messages persist, so be aware that context builds over the conversation unless reset.
    06:36
    Nikita: So, you might start with something like, "Show me sales trends for the last 5 years," and then get more granular, like, "Include only technology products," or "Break the results down by product sub-category."
    Lois: But sometimes, you may just want to start from scratch, so let's discuss how you can reset your session with the AI Assistant.
    Expert: Just select the "Clear Assistant History" option and you can begin a new analysis.
    07:03
    Nikita: Language capabilities are another important consideration, so here's an overview of which languages the Assistant currently supports.
    Expert: Right now, English is the primary language supported. Simple questions in other languages may work, but with less accuracy and fewer features. Talk to your Oracle Analytics administrator if you have multilingual needs.
    07:26
    Lois: Let's clarify what kinds of questions are beyond the scope of the Assistant.
    Expert: The Assistant is built for business-oriented, goal-driven queries, not for technical schema questions or database logic. So, don't ask about dataset structures or technical metadata. But do ask about trends, comparisons, breakdowns, and summaries that relate to your business.
    07:53
    Do you want to fast-track your learning goals? Join us for live events hosted by Oracle expert instructors! Get certification exam tips, learn about new technology, and ask your questions in real time. Take charge of your learning. Visit mylearn.oracle.com and join a live event today! 
    08:13
    Nikita: Welcome back! Now, let's discuss why configuring datasets is crucial for working effectively with the AI Assistant.
    Expert: Effectively indexing and configuring your dataset can make a huge difference when working with the AI Assistant. When you index a dataset, you're basically creating searchable references. This makes it easier for the AI Assistant to quickly locate the most relevant columns and give accurate responses to natural language questions. 
    It's important to know that you'll need to manually select which columns to index. For example, if your users are likely to ask about sales in the United States, you'll want to make sure that both the "Country" column and the "Sales" column are included when indexing. That way, the Assistant knows exactly where to look when someone asks a question about U.S. sales figures. 
    Another thing to remember is that you can make your analytics more user-friendly by resolving ambiguities and assigning synonyms to your dataset columns. For instance, if there's a generic "date" column, clarify whether that refers to the "order date" or the "ship date." It helps to add synonyms as well, so the assistant can handle different ways users might phrase their questions. 
    So, while it may take a little extra effort upfront, making your dataset easy to search and understand pays off. Your AI Assistant can respond quickly and accurately, and your users get the answers they're looking for with less hassle. 
    09:43
    Lois: Next, we'll outline the steps for configuring and indexing datasets for optimal performance.
    Expert: 
    - First, confirm dataset access. You'll need read/write privileges to enable the AI Assistant and index the dataset.
    - On the Search tab, under "Index Dataset For," select "Assistant." Choose your language and, optionally, set an indexing schedule.
    - Carefully pick columns users will likely question, like sales, region, or date. Avoid technical metadata, sensitive data, and high-cardinality columns like Customer IDs.
    - Choose whether to index only column names or names plus data values. Including data values helps with typing suggestions and nuance, but avoid values no one will search on. Importantly, indexed dataset values are never sent to the LLM; they are retrieved from the dataset when visualizations are created.
    - Assign synonyms to attribute names. Oracle Analytics suggests synonyms, but you can also add your own.
    - Finally, save the changes and run indexing to make the dataset searchable by the Assistant.
    10:50
    Nikita: Now, let's look at how configuring subject areas can further tailor the experience.
    Expert: 
    - Navigate to the Search Index through the Console's Configuration and Settings.
    - Choose your language and indexing schedule.
    - Index folders relevant to business questions; avoid non-relevant or sensitive columns.
    - Select the Index Type: "Index Metadata Only" for high-cardinality columns (like IDs); "Index" for columns and values that users reference.
    - As with datasets, clarify column meanings with user-friendly synonyms.
    - Finalize settings and run the index to prepare your subject area for AI-powered queries.
    Special care must be taken with date columns. Select and clearly identify the main business date so queries don't become ambiguous.
    11:39
    Lois: Synonyms play an important role in reducing ambiguity and enhancing results, so let's review the best practices for setting them up effectively.
    Expert: If your columns use abbreviations, acronyms, or codes—like "custNo" or "Pname"—it's a good idea to provide synonyms to clarify what those attributes actually mean. Think about how people typically refer to those columns in everyday language. So instead of just "custNo," add "Customer Number" as a synonym, and for "Pname," you would use "Product Name." 
    If you can, actually renaming the column is usually more effective than just adding a synonym. But if that's not possible for some reason, a synonym is the next best thing. 
    Dates can be another tricky area. Datasets often have several date columns, like "Ship Date," "Order Date," and "Invoice Date." If a user asks, "Show me revenue by date," the system has to decide which date column to use, and it may just pick one for you. If you definitely want "Order Date" to be considered the default date, make sure to assign "date" as a synonym specifically for that column. 
    There's also the situation where different tables have columns with the same name—like "name" from both a Product table and an Employee table. You'll want to use synonyms for these columns too, to make it clear what each one means. 
    Adding more than one synonym can help as well. For example, if you have a "Yield" column, maybe also specify "revenue" and "income" as synonyms, so users can ask questions however they naturally would. 
    Avoid using reserved words or special characters in your synonyms. This means words like "Count," "Year," or anything that's also a SQL function, plus characters like "@" or special symbols. Also, steer clear of Unicode characters and terms that are analytical functions or date formats. 
    The whole point is to make your columns easy for business users or anyone else to reference naturally, using the terms they're most likely to try in a search. 
    And finally, just a few rules of thumb: synonyms can be up to 50 characters long, you can use up to 20 synonyms for each column, and you don't need to worry about uppercase or lowercase; column names aren't case sensitive. 
    Besides the basic setup and using synonyms, you can really improve the quality of answers from the AI Assistant (and the LLM it uses) by prepping and enriching your data. It's easier for the AI to work with words than numbers. Try "binning" numerical values into simple categories people can understand. For instance, instead of showing a long list of sales amounts, split them into groups like "small," "medium," and "large." 
    LLMs handle words better than blanks. If your data has missing or null values, fill them in with something meaningful, like "Unknown," "Not specified," or "Not available." Skipping this step could cause problems such as reports missing customers because their country is blank, incorrect averages or summaries if missing values are ignored, forecasting issues if data gaps throw off trends, or the AI Assistant skipping important columns or even generating errors.
    Ambiguous or duplicate column names confuse both users and the LLM. Make your names clear and consistent. 
    You can use Oracle Analytics's Transform editor to add even more context. For example, you might extract the day of the week from a date, so you can easily ask, "Show sales for all Fridays in 2026." 
    By preparing your data with these steps, you help the AI Assistant give you more accurate and insightful answers, making data analysis a lot smoother!
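If you were preparing the underlying data with SQL before it reaches Oracle Analytics, the binning and null-handling advice above might look like the following sketch (table, column names, and bin boundaries are illustrative):

```sql
SELECT order_id,
       -- Bin raw sales amounts into categories the LLM handles well.
       CASE
         WHEN amount <  1000 THEN 'small'
         WHEN amount < 10000 THEN 'medium'
         ELSE                     'large'
       END AS amount_band,
       -- Replace nulls with a meaningful label instead of a blank.
       NVL(country, 'Unknown') AS country
FROM   orders;
```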
    15:27
    Nikita: Finally, let's walk through the process of making the Oracle Analytics AI Assistant accessible to end users directly within their workbooks.
    Expert: Permissions are controlled through application roles. Your administrator must create a specific role enabling access to the AI Assistant.
    To enable consumer access, open your workbook in edit mode and select Present. From the Workbook tab, toggle the Assistant on in the Insights Panel section. Choose tabs like Watch Lists and Workbook Assistant. Decide which data sources in your workbook are available to the consumer.
    Save, and then use Preview to simulate the user experience.
    Consumers can access the AI Assistant by selecting Auto Insights at the top of the workbook. They can then type in natural language questions, review visualizations, and follow up.
    Repeat these steps for each workbook you wish to enable.
    16:22
    Lois: This really puts agile, self-service analytics at everyone's fingertips, all while keeping data security and integrity front and center.
    Nikita: And it's not just plug-and-play. To get the best results, you configure your data, enrich it, apply the right synonyms and permissions, and then your team can ask questions and visualize results just by using natural language.
    Lois: If you're ready to kickstart or deepen your journey with the Oracle Analytics AI Assistant, or you want to review the topics we covered in today's episode in even greater detail, visit mylearn.oracle.com.
    Nikita: That wraps up this episode. Thanks for spending time listening to us today. Join us next week for another episode of the Oracle University Podcast. Until then, this is Nikita Abraham…
    Lois: And Lois Houston, signing off!
    17:14
    That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
  • Oracle Database@AWS: Monitoring, Logging, and Best Practices

    10/03/2026 | 19 mins.
    Running Oracle Database@AWS is most effective when you have full visibility and control over your environment.
     
    In this episode, hosts Lois Houston and Nikita Abraham are joined by Rashmi Panda, who explains how to monitor performance, track key metrics, and catch issues before they become problems. Later, Samvit Mishra shares key best practices for securing, optimizing, and maintaining a resilient Oracle Database@AWS deployment.
     
    Oracle Database@AWS Architect Professional: https://mylearn.oracle.com/ou/course/oracle-databaseaws-architect-professional/155574
    Oracle University Learning Community: https://education.oracle.com/ou-community
    LinkedIn: https://www.linkedin.com/showcase/oracle-university/
    X: https://x.com/Oracle_Edu
     
    Special thanks to Arijit Ghosh, Anna Hulkower, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.
    ------------------------------------------------------
    Episode Transcript:

    00:00
    Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started!
    00:26
    Nikita: Welcome to the Oracle University Podcast! I'm Nikita Abraham, Team Lead: Editorial Services with Oracle University, and with me is Lois Houston, Director of Communications and Adoption Programs with Customer Success Services.
    Lois: Hello again! Last week's discussion was all about how Oracle Database@AWS stays secure and available. Today, we're joined by two experts from Oracle University. First, we'll hear from Rashmi Panda, Senior Principal Database Instructor, who will tell you how to monitor and log Oracle Database@AWS so your environment stays healthy and reliable.
    Nikita: And then we're bringing in Samvit Mishra, Senior Manager, CSS OU Cloud Delivery, who will break down the best practices that help you secure and strengthen your Oracle Database@AWS deployment. Let's start with you, Rashmi. Is there a service that allows you to monitor the different AWS resources in real time?
    Rashmi: Amazon CloudWatch is the cloud-native AWS monitoring service that can monitor the different AWS resources in real time. It allows you to collect resource metrics, create customized dashboards, and even take action when certain criteria are met. Integration of Oracle Database@AWS with Amazon CloudWatch enables monitoring the metrics of the different database resources that are provisioned in Oracle Database@AWS.
    Amazon CloudWatch collects raw data and processes it to produce near real-time metrics data. Metrics collected for the resources are retained for 15 months. This facilitates analyzing the historical data to understand and compare the performance, trends, and utilization of the database service resources at different time intervals. You can set up alarms that continuously monitor the resource metrics for breaches of user-defined thresholds, and configure alert notifications or take automated action in response to a metric threshold being reached.
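    The alarm workflow Rashmi describes can be sketched with boto3. This is a minimal, hypothetical example: the AWS/ODB namespace comes from the episode, but the metric name, dimension, and thresholds are illustrative assumptions, not confirmed service values.

```python
import json

def build_cpu_alarm(resource_id: str, threshold: float = 80.0) -> dict:
    """Return PutMetricAlarm parameters for a CPU utilization alarm (sketch)."""
    return {
        "AlarmName": f"odb-cpu-high-{resource_id}",
        "Namespace": "AWS/ODB",            # namespace used by Oracle Database@AWS
        "MetricName": "CPUUtilization",    # assumed metric name
        "Dimensions": [{"Name": "ResourceId", "Value": resource_id}],  # assumed dimension
        "Statistic": "Average",
        "Period": 300,                     # evaluate 5-minute averages
        "EvaluationPeriods": 3,            # breach must persist for 15 minutes
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [],                # e.g. an SNS topic ARN for notifications
    }

params = build_cpu_alarm("vmcluster-123")
print(json.dumps(params, indent=2))
# With AWS credentials configured, you would pass these parameters to:
#   boto3.client("cloudwatch").put_metric_alarm(**params)
```

    The alarm action list is where you would wire in the notification or automated response mentioned above.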
    02:19
    Lois: What monitoring features stand out the most in Amazon CloudWatch?
    Rashmi: With Amazon CloudWatch, you can monitor Exadata VM Cluster, container database, and Autonomous Database resources in Oracle Database@AWS. Oracle Database@AWS reports metrics data specific to the resource in the AWS/ODB namespace of Amazon CloudWatch. Metrics can be collected only when the database resource is in an available state in Oracle Database@AWS.
    Each of the resource types has its own metrics defined in the AWS/ODB namespace, for which the metrics data is collected. 
    02:54
    Nikita: Rashmi, can you take us through a few metrics?
    Rashmi: For the Exadata VM Cluster, there are CPU utilization, memory utilization, swap space, and storage file system utilization metrics. Then there is the load average on the server, the node status, the number of allocated CPUs, et cetera.
    Then for the container database, there is CPU utilization, storage utilization, block changes, parse count, execute count, and user calls, which are important elements that can provide metrics data on database load. And for Autonomous Database, the metrics data includes DB time, CPU utilization, logins, IOPS and IO throughput, RedoSize, parse, execute, and transaction counts, and a few others.
    03:32
    Nikita: Once you've collected these metrics and analyzed database performance, what tools or services can you use to automate responses or handle specific events in your Oracle Database@AWS environment?
    Rashmi: Then there is Amazon EventBridge, which can monitor events from AWS services and respond automatically with certain actions that you define. You can monitor events from Oracle Database@AWS in EventBridge; the service sends event data to EventBridge continuously, in real time. EventBridge forwards this event data to targets such as AWS Lambda and Amazon Simple Notification Service to perform actions on the occurrence of certain events.
    Oracle Database@AWS events are structured messages that indicate changes in the life cycle of the database service resources. EventBridge can filter events based on your defined rules, process them, and deliver them to one or more targets. The event bus is the router that receives the events, optionally transforms them, and then delivers them to the targets. Events from Oracle Database@AWS can be generated in two ways: they can be generated from Oracle Database@AWS in AWS, or they can be generated directly from OCI and received by EventBridge in AWS.
    You can monitor Exadata Database and Autonomous Database resource events. Ensure that the Exadata infrastructure is in an available state. You can configure how the events are handled for these resources. You can define rules in EventBridge to filter the events of interest and the target that will receive and process those events. You can filter events based on a pattern depending on the event type, and apply this pattern using the Amazon EventBridge put-rule API, with the default event bus, to route only the matching events to targets.
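    As a rough illustration of the put-rule flow described above, the snippet below builds an event pattern in Python. The source and detail-type strings are assumptions for the sketch; check the events actually delivered in your account before filtering on them.

```python
import json

# Hypothetical event pattern for Oracle Database@AWS lifecycle events.
pattern = {
    "source": ["aws.odb"],                         # assumed event source name
    "detail-type": ["ODB Resource State Change"],  # assumed detail type
    "detail": {"status": ["FAILED"]},              # route only failure events
}

print(json.dumps(pattern))
# Equivalent to the put-rule call against the default event bus:
#   aws events put-rule --name odb-failures \
#       --event-pattern '<pattern JSON>' --event-bus-name default
# or, with boto3:
#   boto3.client("events").put_rule(Name="odb-failures",
#       EventPattern=json.dumps(pattern), EventBusName="default")
```

    A matching put-targets call would then point the rule at a Lambda function or SNS topic.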
    05:13
    Lois: And what about events that AWS itself generates?
    Rashmi: Events that are generated in AWS for the Oracle Database@AWS resources are delivered to the default event bus of your AWS account. These events that are generated in AWS for Oracle Database@AWS resources include lifecycle changes of the ODB network. The different network events are successful creation or failure of the creation of the ODB network, and successful deletion or failure in deletion of the ODB network.
    When you subscribe to Oracle Database@AWS, an event bus with the prefix aws.partner/odb is created in your AWS account. All events generated in OCI for the Oracle Database@AWS resources are then received in this event bus. When you are creating a filter pattern using the Amazon EventBridge put-rule API, you must set the event bus name to this event bus. Make sure you do not delete this event bus. Events generated in OCI and received into the event bus are extensive. They include events of Oracle Exadata infrastructure, VM Cluster, container, and pluggable databases.
    06:14
    Lois: If you want to look back at what's happened in your environment, like who made the changes or accessed resources, what's the best AWS service for logging and auditing all that activity?
    Rashmi: Amazon CloudTrail is a logging service in AWS that records the different actions taken by a user, a role, or an AWS service. Oracle Database@AWS is integrated with Amazon CloudTrail. This enables logging of all the different events on Oracle Database@AWS resources. 
    Amazon CloudTrail captures all the API calls to Oracle Database@AWS as events. These API calls include calls from the Oracle Database@AWS console and code calls to Oracle Database@AWS API operations. The log files are delivered to an Amazon S3 bucket that you specify. These logs record the identity of the caller who made the request to Oracle Database@AWS, the IP address from which the call originated, the time of the call, and some additional details. 
    CloudTrail Event History stores an immutable record of the past 90 days of management events in an AWS region. You can view, search, and download these records from CloudTrail Event History. When you create an AWS account, you automatically get access to CloudTrail Event History. If you would like to retain the logs for a longer period of time beyond 90 days, you can create CloudTrail trails or a CloudTrail Lake event data store. 
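    Querying that 90-day Event History can be sketched as follows. The event source string is an assumption for illustration; the lookup parameters themselves are standard CloudTrail API fields.

```python
import datetime

# Parameters for a CloudTrail LookupEvents call over the last 90 days,
# filtered to one event source (the odb.amazonaws.com value is assumed).
params = {
    "LookupAttributes": [
        {"AttributeKey": "EventSource", "AttributeValue": "odb.amazonaws.com"}
    ],
    "StartTime": datetime.datetime.now(datetime.timezone.utc)
                 - datetime.timedelta(days=90),
    "MaxResults": 50,
}

print(params["LookupAttributes"][0]["AttributeValue"])
# With credentials configured:
#   resp = boto3.client("cloudtrail").lookup_events(**params)
#   for event in resp["Events"]:
#       print(event["EventName"], event.get("Username"))
```

    For retention beyond 90 days, the same events would instead be read from a trail's S3 bucket or a CloudTrail Lake event data store.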
    Management events in AWS provide information about management operations that are performed on the resources in your AWS account. Management operations are also called control plane operations. Thus, the control plane operations in Oracle Database@AWS are logged as management events in CloudTrail logs. 
    07:59
    Are you a MyLearn subscriber? If so, you're automatically a member of the Oracle University Learning Community! Join millions of learners, attend exclusive live events, and connect directly with Oracle subject matter experts. Enjoy the latest news, join challenges, and share your ideas. Don't miss out! Become an active member today by visiting mylearn.oracle.com.
    08:25
    Nikita: Welcome back! Samvit, let's talk best practices. What should teams keep in mind when they're setting up and securing their Oracle Database@AWS environment? 
    Samvit: Use IAM roles and policies with least privilege to manage Oracle Database@AWS resources. This ensures only authorized users can provision or modify DB resources, reducing the risk of accidental or malicious changes. 
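    A least-privilege policy like the one Samvit recommends might look like the sketch below. The odb: action names are assumptions for illustration; use the action names actually published for the service.

```python
import json

# Illustrative read-only IAM policy for Oracle Database@AWS resources.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "OdbReadOnly",
        "Effect": "Allow",
        "Action": ["odb:Get*", "odb:List*"],  # read-only verbs only, no create/delete
        "Resource": "*",
    }],
}
print(json.dumps(policy, indent=2))
```

    Attaching a policy like this to an operator role means the role can inspect database resources but cannot provision or modify them.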
    Oracle Data Safe monitors database activity, user risk, and sensitive data, while AWS CloudTrail records all AWS API calls. Together, they give full visibility across the database and cloud layers.
    Autonomous Database supports Oracle Database Vault for enforcing separation of duties. Exadata Database Service can integrate with Audit Vault and Database Firewall to prevent privileged users from bypassing security controls.
    Enable multifactor authentication for AWS IAM users managing Oracle Database@AWS. This adds a strong second layer of protection against stolen credentials. 
    Always deploy your Oracle Database@AWS in private subnets without public IPs. Use AWS security groups and NACLs to strictly limit inbound and outbound traffic, allowing access only from trusted applications.
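    The inbound restriction described here can be expressed as a single security-group ingress rule. The CIDR below is a placeholder for a trusted application subnet.

```python
# Illustrative ingress rule: allow only the app tier to reach the
# Oracle listener port. The CIDR block is a placeholder.
ingress_rule = {
    "IpProtocol": "tcp",
    "FromPort": 1521,   # default Oracle listener port
    "ToPort": 1521,
    "IpRanges": [{
        "CidrIp": "10.0.1.0/24",      # trusted application subnet only
        "Description": "app tier",
    }],
}
print(ingress_rule)
# Applied with:
#   boto3.client("ec2").authorize_security_group_ingress(
#       GroupId="sg-...", IpPermissions=[ingress_rule])
```

    With no other ingress rules and no public IPs, traffic from anywhere else is dropped by default.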
    Exadata Database Service supports integration with OCI Vault for key lifecycle management. And in the case of Autonomous Database, the transparent data encryption keys are automatically managed, but you can bring your own keys with OCI Vault. Key rotation ensures compliance and reduces the risk of key compromise.
    Oracle Database@AWS enforces encrypted connections by default. Ensure clients connect with TLS 1.2 or 1.3 to protect data in transit from interception or tampering. 
    Use Oracle Data Safe's user assessment features to detect dormant users or excessive privileges. Disable unused accounts and rightsize permissions to reduce insider threats and security gaps.
    Export database audit logs to Oracle Data Safe Audit Vault or AWS S3 with object lock for immutability. This prevents log tampering and ensures audit evidence is preserved for compliance. 
    11:25
    Lois: OK, that covers security. Do you have any tips for making sure your Oracle Database@AWS setup is reliable and resilient?
    Samvit: Start with clear recovery objectives. Define how much downtime and data loss each workload can tolerate. These targets drive your HADR architecture and backup strategy. 
    Implement business continuity measures to deliver maximum uptime for your databases. As a best practice, you must configure a disaster recovery environment for your critical databases so that, in the event of any disaster affecting the primary database, applications can be immediately failed over to the DR environment, ensuring minimal application downtime and zero or minimal data loss. With Oracle Database@AWS, you can automate the creation and management of the DR environment for your database services using different deployment capabilities. You can opt to configure either cross-availability zone DR in the same region or cross-region DR. Since cross-availability zone DR can only provide site failure protection, you must also configure cross-region DR to protect against regional failure.
    A DR plan is only effective if tested. Regular failover and switchover drills validate that people, processes, and systems can recover as designed. 
    For Exadata Database, Autonomous Recovery Service provides automated backup validation, recovery guarantees, and protection against accidental data loss or corruption. 
    Oracle-managed backups are fully managed by OCI. When you create your Oracle Exadata Database, you can enable automatic backups by choosing Enable Automatic Backups in the OCI Console. When you do that, you can select Amazon S3, OCI Object Storage, or Autonomous Recovery Service as the backup destination.
    Don't just take backups; you also need to test them. Regularly restore backups into a non-production environment to validate integrity and recovery time. 
    Plan beyond just the database. Map application and middleware dependencies to ensure end-to-end business resilience. A database failover is useless if dependent apps can't reconnect.
    14:09
    Nikita: Another area of interest is performance and cost. What practices help teams balance the two?
    Samvit: Autonomous Database automatically scales CPU and storage as workloads grow. This ensures performance during peaks while avoiding overprovisioning. So you should enable ADB auto-scaling. 
    Monitor CPU, memory, and IO metrics with AWS CloudWatch to rightsize your compute. Scale up or down based on actual utilization instead of static provisioning.
    Autonomous Database continuously evaluates and creates indexes automatically. This improves query performance without requiring manual tuning. 
    Use connection pooling in your applications to optimize database connections. Minimizing round-trips reduces latency and improves throughput.
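    The pooling pattern can be sketched with the python-oracledb driver. The credentials and connect string are placeholders, and pool sizing depends on the workload; the create_pool call itself is shown commented out since it needs a live database.

```python
# Pool parameters for python-oracledb. All values here are placeholders
# for illustration; size the pool against your actual workload.
pool_params = {
    "user": "app_user",                   # placeholder credentials
    "password": "********",
    "dsn": "db.example.com:1521/pdb1",    # placeholder connect string
    "min": 2,                             # keep a warm floor of connections
    "max": 10,                            # cap to protect the database
    "increment": 1,                       # grow one connection at a time
}
print(pool_params["min"], pool_params["max"])
# Create the pool once at application startup, then acquire per request:
#   pool = oracledb.create_pool(**pool_params)
#   with pool.acquire() as conn:
#       ...  # run queries; the connection returns to the pool on exit,
#            # avoiding a fresh connect round-trip on every request
```

    Acquiring from a pool rather than opening a new session per request is what eliminates the connect/teardown round-trips mentioned above.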
    Apply AWS tags to database and related resources for cost allocation and chargeback. Tagging also helps with governance and cost visibility. 
    Choose between bring your own license and license-included models for Oracle Database@AWS. The right model depends on your existing license portfolio and cost strategy.
    Not all workloads need long backup retention. Adjust retention policies based on business needs to balance compliance with storage costs. 
    Exadata Database supports Oracle multitenant with pluggable databases. Consolidating databases reduces infrastructure footprint and licensing costs.
    Performance tuning isn't just technical. Align metrics with business KPIs. Correlating DB performance to user experience and revenue impact helps prioritize optimizations. 
    16:20
    Lois: Before we wrap up, Samvit, let's look at operational efficiency. What advice do you have for making day-to-day operations more efficient?
    Samvit: Use infrastructure as code tools like Terraform or AWS CloudFormation to automate provisioning. This ensures consistent, repeatable deployments with minimal manual errors. 
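    As a hedged sketch of the infrastructure-as-code idea, the snippet below assembles a minimal CloudFormation-style template in Python. The AWS::ODB::OdbNetwork resource type and its DisplayName property are assumptions for illustration; check the AWS resource reference for the actual Oracle Database@AWS types and properties.

```python
import json

# Minimal, hypothetical CloudFormation template describing an ODB network.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "OdbNetwork": {
            "Type": "AWS::ODB::OdbNetwork",               # assumed resource type
            "Properties": {"DisplayName": "prod-odb-net"}, # assumed property
        }
    },
}
print(json.dumps(template, indent=2))
# Deployed with, e.g.:
#   aws cloudformation deploy --template-file template.json --stack-name odb-net
```

    Keeping the template in version control is what makes the deployment repeatable and reviewable, which is the point Samvit is making.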
    For Autonomous Database, enable auto-start/stop to optimize costs by running databases only when needed. This is ideal for dev/test or seasonal workloads.
    Exadata Database Service provides fleet maintenance to patch multiple systems consistently. This reduces downtime and simplifies lifecycle management. 
    Integrate AWS CloudWatch for performance monitoring and EventBridge for event-driven automation. This helps detect issues early and trigger automated workflows.
    Oracle Data Safe provides ready-to-use audit and compliance reports. Use these to streamline governance and reduce the effort of manual compliance tracking. 
    For Autonomous Database, Performance Hub simplifies monitoring, while Exadata users benefit from AWR and ASH reports. Together, they give deep insights into performance trends.
    Automated tagging policies and change management workflows help maintain governance. They ensure resources are tracked properly and changes are auditable. 
    Monitor storage consumption and growth patterns using AWS CloudWatch and the ADB Console. Proactive tracking helps avoid capacity issues and unexpected costs.
    Send CloudTrail logs into EventBridge to trigger automated incident responses. This shortens response time and builds operational resilience. 
    18:36
    Nikita: Samvit and Rashmi, thanks for spending time with us today. Your insights always help bring the bigger picture into focus.
    Lois: They definitely do. And if you'd like to go deeper into everything we covered, head over to mylearn.oracle.com and look up the Oracle Database@AWS Architect Professional course. Until next time, this is Lois Houston…
    Nikita: And Nikita Abraham, signing off!
    19:03
    That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
  • Oracle University Podcast

    How Oracle Database@AWS Stays Secure and Available

    03/03/2026 | 16 mins.
    When your business runs on data, even a few seconds of downtime can hurt. That's why this episode focuses on what keeps Oracle Database@AWS running when real-world problems strike.
     
    Hosts Lois Houston and Nikita Abraham are joined by Senior Principal Database Instructor Rashmi Panda, who takes us inside the systems that keep databases resilient through failures, maintenance, and growing workloads.
     
    Oracle Database@AWS Architect Professional: https://mylearn.oracle.com/ou/course/oracle-databaseaws-architect-professional/155574
    Oracle University Learning Community: https://education.oracle.com/ou-community
    LinkedIn: https://www.linkedin.com/showcase/oracle-university/
    X: https://x.com/Oracle_Edu
     
    Special thanks to Arijit Ghosh, Anna Hulkower, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.
    --------------------------------------------------
     
    Episode Transcript:

    00:00
    Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started!
    00:26
    Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Communications and Adoption Programs with Customer Success Services, and with me is Nikita Abraham, Team Lead: Editorial Services with Oracle University.
    Nikita: Hi everyone! In our last episode, we explored the security and migration strengths of Oracle Database@AWS. Today, we're joined once again by Senior Principal Database Instructor Rashmi Panda to look at how the platform keeps your database available and resilient behind the scenes.
    01:00
    Lois: It's really great to have you with us, Rashmi. As many of you may know, keeping critical business applications running smoothly is essential for success. And that's why it's so important to have deployments that are highly resilient to unexpected failures, whether those failures are hardware-, software-, or network-related. With that in mind, Rashmi, could you tell us about the Oracle technologies that help keep the database available when those kinds of issues occur?
    Rashmi: Databases deployed in Oracle Database@AWS are built on Oracle's foundational high availability architecture. Oracle Real Application Clusters, or Oracle RAC, is an active-active architecture where multiple database instances run concurrently on separate servers, all accessing the same physical database on shared storage to simultaneously process various application workloads.
    Even though each instance runs on a separate server, they collectively appear as a single unified database to the application. As the workload grows and demands additional computing capacity, new nodes can be added to the cluster to spin up new database instances to support the additional computing requirements. This enables you to scale out your database deployments without having to bring down your application, and eliminates the need to replace existing servers with higher-capacity ones, offering a more cost-effective solution.
    02:19
    Nikita: That's really interesting, Rashmi. It sounds like Oracle RAC offers both scalability and resilience for mission-critical applications. But of course, even the most robust systems require regular maintenance to keep them running at their best. So, how does planned maintenance affect performance? 
    Rashmi: Maintenance on databases can take a toll on your application uptime. Database maintenance activities typically include applying database patches or performing updates. Along with the database updates, there may also be updates to the host operating system. These operations often demand significant downtime for the database, which consequently leads to higher application downtime.
    Oracle Real Application Clusters provides rolling patching and rolling upgrade features, enabling patching and upgrades to be performed in a rolling fashion without bringing down the entire cluster, which significantly reduces application downtime. 
    03:10
    Lois: And what happens when there's a hardware failure? How does Oracle keep things running smoothly in that situation?
    Rashmi: In the event of an instance or hardware failure, Oracle RAC ensures automatic service failover. This means that if one of the instances or nodes in the cluster goes down, the system transparently fails over the service to an available instance in the cluster, ensuring minimal disruption to your application.
    This feature enhances the overall availability and resilience of your database. 
    03:39
    Lois: That sounds like a powerful way to handle unexpected issues. But for businesses that need even greater resilience and can't afford any downtime, are there other Oracle solutions designed to address those needs?
    Rashmi: Oracle Exadata is the maximum availability architecture database platform for Oracle databases. The core design principle of Oracle Exadata is redundancy, spanning the networking, power supplies, database servers, storage servers, and their components.
    This robust architecture ensures protection against the failure of any individual component, effectively guaranteeing continuous database availability. The scale-out architecture of Oracle Exadata allows you to start your deployment with two database servers and three storage servers, with different numbers of CPU cores and different sizes and types of storage to meet current business needs.
    04:26
    Lois: And if a business suddenly finds demand growing, how does the system handle that? Is it able to keep up with increased needs without disruptions?
    Rashmi: As the demand increases, the system can be easily expanded by adding more servers, ensuring that performance and capacity grow with your business requirements. Exadata Database Service deployments in Oracle Database@AWS leverage these foundational technologies to provide high availability of the database system. This is achieved by provisioning databases using Oracle Real Application Clusters, hosted on the redundant infrastructure provided by the Oracle Exadata infrastructure platform.
    This deployment architecture provides the ability to scale compute and storage to meet growing resource demands without the need for downtime. You can scale up the number of enabled CPUs symmetrically in each node of the cluster when there is a need for higher processing power, or you can scale out the infrastructure by adding more database and storage servers up to the Exadata infrastructure model limit, which in itself is large enough to support any large workload.
    The Exadata Database Service running on Oracle RAC instances enables any maintenance on individual nodes or patching of the database to be performed with zero or negligible downtime. The rolling feature allows patching one instance at a time, while services seamlessly fail over to the available instances, ensuring that the application experiences little to no disruption during maintenance.
    Oracle RAC, coupled with Oracle Exadata's redundant infrastructure, protects the Database Service from any single point of failure. This fault-tolerant architecture features redundant networking and mirrored disks, enabling automatic failover in the event of a component failure. Additionally, if any node in the cluster fails, there is zero or negligible disruption to the dependent applications.
    06:09
    Nikita: That's really impressive, having such strong protection against failures and so little disruption, even during scaling and maintenance. But let's say a company wants those high-availability benefits in a fully managed environment, so they don't have to worry about maintaining the infrastructure themselves. Is there an option for that?
    Rashmi: Similar to Oracle Exadata Database Service, Oracle Autonomous Database Service on dedicated infrastructure in Oracle Database@AWS also offers the same features, with the key difference being that it's a fully managed service. This means customers have zero responsibility for maintaining and managing the Database Service.
    This, again, uses the same Oracle RAC technology and Oracle Exadata infrastructure to host the Database Service, where most of the database activities are fully automated, providing you a highly available database with extreme performance capability. It provides an elastic database deployment platform that can scale up storage and CPU online, or can be enabled to autoscale storage and compute.
    Maintenance activities on the database like database updates are performed automatically without customer intervention and without the need of downtime, ensuring seamless operation of applications.
    07:20
    Lois: Can we shift gears a bit, Rashmi? Let's talk about protecting data and recovering from the unexpected. What Oracle technologies help guard against data loss and support disaster recovery for databases?
    Rashmi: Oracle Database Autonomous Recovery Service is a centralized backup management solution for Oracle Database services in Oracle Cloud Infrastructure.
    It automatically takes backups of your Oracle databases and securely stores them in the cloud. It ensures seamless data protection and rapid recovery for your database. It is a fully managed solution that eliminates the need for any manual database backup management, freeing you from the associated overhead.
    It implements an incremental forever backup strategy, a highly efficient approach where only the changes since the last backup are identified and backed up. This approach drastically reduces the time and storage space needed for backup, as the size of the incremental changes is significantly lower than the full database backup.
    08:17
    Nikita: And what's the benefit of using this backup approach?
    Rashmi: The benefit of this approach is that your backups complete faster, with far less compute and network resources, while still guaranteeing the full recoverability of your database in the event of a failure. You can achieve zero data loss with this backup service by enabling the real-time protection option, minimizing data loss by recovering data up to the last subsecond.
    It is highly recommended to enable this option for mission-critical databases that cannot tolerate any data loss, whether due to a ransomware attack or due to an unplanned outage. The protection policy can retain the protected database backups for a minimum of 14 days to a maximum of 95 days.
    The recovery service requires and enforces that backups are encrypted. These backups are compressed and encrypted during the backup process. The integrity of the backups is continuously validated without placing a burden on the production database.
    This ensures that the stored backup data is consistent and recoverable when needed. This protects against malicious user activity or any ransomware attack. With strict policy-based retention strategy, it prevents modification or deletion of backup data by malicious users.
    09:30
    Lois: Now, let's look at the next layer of protection. Rashmi, can you tell us about Oracle Active Data Guard?
    Rashmi: Oracle Active Data Guard provides highly available data protection and disaster recovery for enterprise Oracle databases. It creates and manages one or more transactionally consistent standby copies of the production database, which is the active primary.
    The standby database is isolated from the production environment, located miles away in a distant data center, ensuring the standby remains protected and unaffected even if the primary is impacted by a disaster.
    In the event of a disaster or data corruption occurring at the primary, the standby can take over the role as new primary, thus allowing business to continue its operations uninterrupted. It keeps the standby database in sync with the production database by continuously applying change logs from production.
    10:25
    Do you want to stay ahead in today's fast-paced world? Check out our New Features courses for Oracle Fusion Cloud Applications. Each quarter brings new updates and hands-on training to keep your skills sharp and your knowledge current. Head over to mylearn.oracle.com to dive into the latest advancements!
    10:45
    Nikita: Welcome back! Rashmi, how does Oracle Active Data Guard operate in practice?
    Rashmi: It uses its knowledge of the Oracle Database block format to continuously validate blocks for physical or logical intrablock corruption during redo transport and change apply. With the automatic block repair feature, whenever a corrupt block is detected in the primary or the standby database, it is automatically repaired by transferring a good copy of the block from another destination that holds it. This is handled transparently, without any error being reported to the application.
    It enables you to offload read-only workloads and backup operations to the standby database, reducing the load on the production database. You can achieve zero data loss at any distance by configuring a special synchronization mechanism known as Far Sync.
    File systems form the attack surface for ransomware. Since Active Data Guard replicates the data at the memory level, any ransomware attack on the primary database will never be replicated to the standby database. This allows for a safe failover to the standby without any data loss, shielding the database from the effects of the attack.
    You can enable automatic failover of the primary database to a chosen standby database without any manual intervention by configuring a Data Guard Broker. The Data Guard Broker continuously monitors the primary database and automatically performs a failover to the standby when the predefined failover conditions are met. Active Data Guard enables you to perform database maintenance or database software upgrades with almost zero or minimal downtime.
    12:18
    Lois: And how does disaster recovery work for Exadata Database Service in Oracle Database@AWS?
    Rashmi: Exadata Database Service, by design, is already protected against local failures by the use of technologies like Oracle RAC and Oracle Exadata.
    Now, by deploying Exadata Database Service across multiple availability zones in an AWS region, you can ensure that your database services remain resilient to site failures. It leverages Oracle Active Data Guard to create a standby in a separate availability zone, such that if the primary availability zone is affected, all application traffic can be routed to the database services in the secondary availability zone, restoring business continuity of the application.
    Through continuous validation of the data blocks at both the primary and the standby database, any potential corruption is detected and prevented. This ensures data integrity and protection across the entire database service.
    By leveraging zero data loss Autonomous Recovery Service, the database ensures that the backup remains secure and unaffected by ransomware. This enables rapid restoration of clean, uncompromised data in the event of an attack.
    Periodic patching and upgrades are performed online in a rolling fashion, with little to no impact on application uptime, using a combination of Oracle RAC and Oracle Active Data Guard technologies. Resource-intensive, read-only workloads, like database backups or generating monthly reports, can be offloaded to the standby, reducing the load on the production database.
    In the cross-availability zone DR setup, you have the flexibility to configure Active Data Guard to use either the AWS network or the OCI network for shipping database redo logs to the standby database.
    Choosing which network to use for the traffic is entirely at the enterprise's discretion. Both are compliant with Oracle Maximum Availability Architecture, and the setup is simple. Whether the traffic uses the OCI network or the AWS network, the respective cloud provider is responsible for ensuring its reliability.
    You do have to take into account the different charges that each cloud provider may apply. You can provision multiple standby databases using the console. Optionally, you may set up Data Guard Broker manually to enable automatic failover capability.
    14:30
    Nikita: We just covered cross-availability-zone protection. But what if an entire AWS region goes down?
    Rashmi: This is where we can provide an additional level of protection by provisioning cross-region disaster recovery for your Exadata Database Service in Oracle Database@AWS. 
    This deployment protects your database against regional disasters. You can provision another DR environment in a different AWS region that supports Oracle Database@AWS. This deployment, together with the cross-availability zone deployment, complements your highly available and protected database service deployment in Oracle Database@AWS.
    Under the hood, it uses the same Oracle Database technologies, including Oracle Active Data Guard, OCI Autonomous Recovery Service, Oracle Exadata, and Oracle RAC, to provide the same capabilities as the cross-availability zone deployment.
    Here too, you have the flexibility to configure Oracle Active Data Guard to use either the AWS network or the OCI network for shipping database redo logs to the standby. The network traffic options remain the same, except for a small difference with respect to chargeback.
    When using the OCI network for cross-region deployment, there is no charge for the first 10 TB of data transfer per month. Beyond that, standard OCI charges apply. When using the AWS network, refer to the AWS pricing sheet for cross-region traffic.
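The OCI-network pricing model above (first 10 TB per month free, standard charges beyond) can be sketched as a small calculator. The per-TB rate is a placeholder assumption, not an actual OCI price; check current OCI pricing before relying on any figure.

```python
def oci_cross_region_cost(tb_per_month: float, rate_per_tb: float) -> float:
    """Estimate monthly cross-region transfer cost on the OCI network.

    The first 10 TB per month is free; only the excess is billed at
    rate_per_tb (a placeholder -- consult current OCI pricing).
    """
    FREE_TB = 10.0
    billable_tb = max(0.0, tb_per_month - FREE_TB)
    return billable_tb * rate_per_tb

print(oci_cross_region_cost(8, 25.0))   # within the free tier -> 0.0
print(oci_cross_region_cost(14, 25.0))  # 4 billable TB at $25/TB -> 100.0
```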
    15:49
    Nikita: Thank you so much, Rashmi, for this insightful episode.
    Lois: Yes, thank you! And if you want to dive deeper into the topics we covered today, go to mylearn.oracle.com and search for the Oracle Database@AWS Architect Professional course. Until next time, this is Lois Houston…
    Nikita: And Nikita Abraham, signing off!
    16:13
    That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
  • Oracle University Podcast

    Security and Migration with Oracle Database@AWS

    24/02/2026 | 20 mins.
    In this episode, hosts Lois Houston and Nikita Abraham are joined by special guests Samvit Mishra and Rashmi Panda for an in-depth discussion on security and migration with Oracle Database@AWS. Samvit shares essential security best practices, compliance guidance, and data protection mechanisms to safeguard Oracle databases in AWS, while Rashmi walks through Oracle's powerful Zero-Downtime Migration (ZDM) tool, explaining how to achieve seamless, reliable migrations with minimal disruption.
     
    Oracle Database@AWS Architect Professional: https://mylearn.oracle.com/ou/course/oracle-databaseaws-architect-professional/155574
    Oracle University Learning Community: https://education.oracle.com/ou-community
    LinkedIn: https://www.linkedin.com/showcase/oracle-university/
    X: https://x.com/Oracle_Edu
     
    Special thanks to Arijit Ghosh, Anna Hulkower, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.
     
    -------------------------------------------------------------
     
    Episode Transcript:
    00:00
    Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started!
    00:26
    Nikita: Welcome to the Oracle University Podcast! I'm Nikita Abraham, Team Lead: Editorial Services with Oracle University, and with me is Lois Houston, Director of Communications and Adoption with Customer Success Services.
    Lois: Hello again! We're continuing our discussion on Oracle Database@AWS and in today's episode, we're going to talk about the aspects of security and migration with two special guests: Samvit Mishra and Rashmi Panda. Samvit is a Senior Manager and Rashmi is a Senior Principal Database Instructor. 
    00:59
    Nikita: Hi Samvit and Rashmi! Samvit, let's begin with you. What are the recommended security best practices and data protection mechanisms for Oracle Database@AWS?
    Samvit: Instead of everyone using the root account, which has full access, we create individual users with AWS IAM Identity Center or the IAM service.
    In addition, you must use multi-factor authentication. As an example, you need a password and a temporary code from a virtual MFA app to log in to the console. 
    Always use SSL or TLS to communicate with AWS services. This ensures data in transit is encrypted. Without TLS, the sensitive information like credentials or database queries can be intercepted.
    AWS CloudTrail records every action taken in your AWS account-- who did what, when, and from where. This helps with audit, troubleshooting, and detecting suspicious activity. So you must set up API and user activity logging with AWS CloudTrail. 
    Use AWS encryption solutions along with all default security controls within AWS services. To store and manage the keys used by transparent data encryption, which is enabled by default, Oracle Database@AWS uses OCI Vault. Currently, Oracle Database@AWS doesn't support the AWS Key Management Service.
    You should also use advanced managed security services such as Amazon Macie, which assists in discovering and securing sensitive data that is stored in Amazon S3. 
    03:08
    Lois: And how does Oracle Database@AWS deliver strong security and compliance?
    Samvit: Oracle Database@AWS enforces transparent data encryption for all data at rest, ensuring stored information is always protected. Data in transit is secured using SSL and Native Network Encryption, providing end-to-end confidentiality.
    Oracle Database@AWS also uses OCI Vault for centralized and secure key management. This allows organizations to manage encryption keys with fine-grained control, rotation policies, and audit capabilities to ensure compliance with regulatory standards. At the database level, Oracle Database@AWS supports unified auditing and fine-grained auditing to track user activity and sensitive operations.
    At the resource level, AWS CloudTrail and the OCI Audit service provide comprehensive visibility into API calls and configuration changes. At the database level, security is enforced using database access control lists and Database Firewall to restrict unauthorized connections. At the VPC level, network ACLs and security groups provide layered network isolation and access control. Again at the database level, Oracle Database@AWS enforces access controls through Database Vault, Virtual Private Database, and row-level security to prevent unauthorized access to sensitive data. And at the resource level, AWS IAM policies, groups, and roles manage user permissions with fine-grained control.
    05:27
    Lois: Samvit, what steps should users be taking to keep their databases secure?
    Samvit: Security is not a single feature but a layered approach covering user access, permissions, encryption, patching, and monitoring.
    The first step is controlling who can access your database and how they connect. At the user level, strong password policies ensure only authorized users can log in. At the network level, private subnets and network security groups let you isolate database traffic and restrict access to trusted applications only.
    One of the most critical risks is accidental or unauthorized deletion of database resources. To mitigate this, grant delete permissions only to a minimal set of administrators. This reduces the risk of downtime caused by human error or malicious activity.
    Encryption ensures that even if the data is exposed, it cannot be read. By default, all databases in OCI are encrypted using transparent data encryption. For migrated databases, you must verify encryption is enabled and active. Best practice is to rotate the transparent data encryption master key every 90 days or less to maintain compliance and limit exposure in case of key compromise.
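The 90-day rotation policy mentioned above is easy to encode as a compliance check. This is an illustrative sketch, not an Oracle tool; the function name and interface are invented for the example.

```python
from datetime import date, timedelta

# Best practice from the discussion: rotate the TDE master key
# every 90 days or less.
ROTATION_INTERVAL = timedelta(days=90)

def rotation_due(last_rotated: date, today: date) -> bool:
    """Return True when the TDE master key is at or past the 90-day policy."""
    return today - last_rotated >= ROTATION_INTERVAL

print(rotation_due(date(2026, 1, 1), date(2026, 4, 15)))  # 104 days -> True
print(rotation_due(date(2026, 3, 1), date(2026, 4, 15)))  # 45 days -> False
```

A check like this could feed a scheduled alert so the key never ages past policy unnoticed.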
    Unpatched databases are one of the most common entry points for attackers. Always apply Oracle critical patch updates on schedule. This mitigates known vulnerabilities and ensures your environment remains protected against emerging threats.
    07:33
    Nikita: Beyond what users can do, are there any built-in features or tools from Oracle that really help with database security?
    Samvit: Beyond the basics, Oracle provides powerful database security tools. Features like data masking allow you to protect sensitive information in non-production environments. Auditing helps you monitor database activity and detect anomalies or unauthorized access.
    Oracle Data Safe is a managed service that takes database security to the next level. It can assess your database configuration for weaknesses, detect risky user accounts and privileges, identify and classify sensitive data, implement controls such as masking to protect that data, and continuously audit user activity to ensure compliance and accountability.
    Now, transparent data encryption enables you to encrypt sensitive data that you store in tables and tablespaces. It also enables you to encrypt database backups. After the data is encrypted, this data is transparently decrypted for authorized users or applications when they access that data.
    You can configure OCI Vault as part of the transparent data encryption implementation. This enables you to centrally manage keystores across your enterprise. OCI Vault gives you centralized control over encryption keys, including key rotation and customer-managed keys.
    09:23
    Lois: So obviously, lots of companies have to follow strict regulations. How does Oracle Database@AWS help customers with compliance? 
    Samvit: Oracle Database@AWS has achieved a broad and rigorous set of compliance certifications. The service supports SOC 1, SOC 2, and SOC 3, as well as HIPAA for health care data protection. If we talk about SOC 1, that basically covers internal controls for financial statements and reporting. SOC 2 covers internal controls for security, confidentiality, processing integrity, privacy, and availability.
    SOC 3 covers SOC 2 results tailored for a general audience. And HIPAA is a federal law that protects patients' health information and ensures its confidentiality, integrity, and availability. Oracle Database@AWS also holds certifications and attestations such as CSA STAR, C5, and HDS.
    Now C5 is a German government standard that verifies cloud providers meet strict security and compliance requirements. CSA STAR attestation is an independent third-party audit of cloud security controls. CSA STAR certification also validates a cloud provider's security posture against CSA's cloud controls matrix. And HDS is a French certification that ensures cloud providers meet stringent requirements for hosting and protecting health care data.
    Oracle Database@AWS also holds ISO and IEC certifications. You can also see PCI DSS, which covers payment card security, and HITRUST, a high-assurance healthcare framework. These certifications ensure that Oracle Database@AWS not only adheres to best practices in security and privacy, but also provides customers with assurance that their workloads align with globally recognized compliance regimes.
    11:47
    Nikita: Thank you, Samvit. Now Rashmi, can you walk us through Oracle's migration solution that helps teams move to OCI Database Services?
    Rashmi: Oracle Zero-Downtime Migration is a robust and flexible end-to-end database migration solution that can completely automate and streamline the migration of Oracle databases. With bare-minimum inputs, it can orchestrate and execute the entire migration task with virtually no manual effort on your part.
    And the best part is you can use this tool for free to migrate your source Oracle databases to OCI Oracle Database Services faster and reliably, eliminating the chances of human errors. You can migrate individual databases or migrate an entire fleet of databases in parallel.
    12:34
    Nikita: Ok. For someone planning a migration with ZDM, are there any key points they should keep in mind? 
    Rashmi: When migrating using ZDM, your source databases may require minimal downtime of up to 15 minutes, or no downtime at all, depending on the scenario. ZDM is built on the principles of Oracle Maximum Availability Architecture and leverages technologies like Oracle GoldenGate and Oracle Data Guard to achieve a highly available, online migration workflow, using Oracle migration methods like RMAN, Data Pump, and Database Links.
    Depending on the migration requirement, ZDM provides different migration method options. It can be logical or physical migration in an online or offline mode. Under the hood, it utilizes the different database migration technologies to perform the migration.
    13:23
    Lois: Can you give us an example of this?
    Rashmi: When you are migrating a mission critical production database, you can use the logical online migration method. And when you are migrating a development database, you can simply choose the physical offline migration method.
    As part of the migration job, you can perform database upgrades or convert your database to multitenant architecture. ZDM offers greater flexibility and automation in performing the database migration.
    You can customize the workflow by adding pre- or post-run scripts. Run prechecks to catch possible failures that may arise during migration and fix them. Audit migration job activity and user actions. Control the execution: pause and resume a job, suspend it if needed, schedule it, or terminate a running job. You can even rerun a job from the point of failure, among other capabilities.
    14:13
    Lois: And what kind of migration scenarios does ZDM support?
    Rashmi: Your source Oracle Database must be version 11.2.0.4 or above. For lower versions, you will first have to upgrade to at least 11.2.0.4. You can migrate Oracle databases of either Standard or Enterprise Edition.
    ZDM supports migration of Oracle databases that may be single-instance, RAC One Node, or RAC databases. It can migrate databases on Unix platforms like Linux, Oracle Solaris, and AIX. For Oracle databases on the AIX and Oracle Solaris platforms, ZDM uses the logical migration method.
    But if the source platform is Linux, it can use both the physical and logical migration methods. You can use ZDM to migrate databases that may be on premises, in a third-party cloud, or even within Oracle Cloud Infrastructure. ZDM leverages Oracle technologies like RMAN, Data Pump, Database Links, Data Guard, and Oracle GoldenGate when choosing a specific migration workflow.
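The platform and downtime constraints described above can be sketched as a simple method chooser. This is an illustrative sketch of the decision logic, not part of ZDM itself; the function name and return strings are invented for the example, and real migrations involve more factors than these two inputs.

```python
def choose_zdm_method(platform: str, downtime_ok: bool) -> str:
    """Sketch of the constraints described above: AIX and Solaris sources
    support only logical migration, while Linux supports both physical
    and logical. Offline modes suit databases that can tolerate downtime;
    online modes suit mission-critical databases that cannot."""
    platform = platform.lower()
    if platform in ("aix", "solaris"):
        kind = "logical"
    elif platform == "linux":
        kind = "physical or logical"
    else:
        raise ValueError(f"unsupported source platform: {platform}")
    mode = "offline" if downtime_ok else "online"
    return f"{kind} {mode}"

# A mission-critical database on AIX must go logical online;
# a development database on Linux can take the simpler offline route.
print(choose_zdm_method("aix", downtime_ok=False))
print(choose_zdm_method("linux", downtime_ok=True))
```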
    15:15
    Are you ready to revolutionize the way you work? Discover a wide range of Oracle AI Database courses that help you master the latest AI-powered tools and boost your career prospects. Start learning today at mylearn.oracle.com.
    15:35
    Nikita: Welcome back! Rashmi, before someone starts using ZDM, is there any prep work they should do or things they need to set up first?
    Rashmi: Working with ZDM needs a few simple configuration steps. Zero-Downtime Migration provides a command-line interface to run your migration jobs. First, download the ZDM binary, preferably from My Oracle Support, where you can get the binary with the latest updates.
    Set up and configure the binary by following the instructions available in the same support note. The host on which ZDM is installed and configured is called the Zero-Downtime Migration service host. The host has to be Oracle Linux version 7 or 8, or RHEL 8.
    Next is the orchestration step, where connectivity to the source and target is configured and tested: SSH configuration with the source and target, opening ports at the respective destinations, creating the dump destination, and granting the required database privileges. Then prepare the response file with parameter values that define the workflow ZDM should use during the Oracle Database migration.
    You can also customize the migration workflow using the response file, plugging in scripts to be executed before or after a specific phase of the migration job. These customizations are called custom plug-ins with user actions.
    Your sources may be hosted on premises, on OCI-managed database services, or even in a third-party cloud. They may be Oracle Database Standard or Enterprise Edition, running on Exadata infrastructure or standard compute.
    The target can be of the same type as the source. But additionally, ZDM supports migration to multicloud deployments on Oracle Database@Azure, Oracle Database@Google Cloud, and Oracle Database@AWS.
    You begin with a migration strategy: list the different databases to be migrated, classify and group them, perform pre-migration checks like dependencies, downtime requirements, and versions, and prepare the migration order, the target migration environment, et cetera.
    17:27
    Lois: What migration methods and technologies does ZDM rely on to complete the move?
    Rashmi: There are primarily two types of migration: physical or logical.
    Physical migration copies the database blocks to the target database, whereas logical migration copies the logical elements of the database, like metadata and data.
    Each of these migration methods can be executed while the database is online or offline. In online mode, migration is performed while changes are still in progress in the source database, whereas in offline mode, all changes to the source database are frozen.
    Physical offline migration uses a backup-and-restore technique, while physical online migration creates a physical standby using backup and restore, then performs a switchover once the standby is in sync with the source database.
    Logical offline migration exports and imports database metadata and data into the target database, while logical online migration combines the export and import operation with the application of incremental updates from the source to the target database. The physical or logical offline migration methods are used when the source database or the application can allow some downtime for the migration.
    The physical or logical online migration approach is ideal for scenarios where any downtime for the source database would badly affect critical applications. The only downtime the application must tolerate is during the connection switchover to the migrated database.
    One other advantage: ZDM can migrate a single database or a fleet of Oracle databases by executing multiple jobs in parallel, where each job's workflow can be customized to a specific database's needs. It can perform physical or logical migration of your Oracle databases. 
    And whether it should be performed online or offline depends on the downtime that can be approved by business.
    19:13
    Nikita: Samvit and Rashmi, thanks for joining us today.
    Lois: Yeah, it's been great to have you both. If you want to dive deeper into the topics we covered today, go to mylearn.oracle.com and search for the Oracle Database@AWS Architect Professional course. Until next time, this is Lois Houston…
    Nikita: And Nikita Abraham, signing off!
    19:35
    That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

About Oracle University Podcast

Oracle University Podcast delivers convenient, foundational training on popular Oracle technologies such as Oracle Cloud Infrastructure, Java, Autonomous Database, and more to help you jump-start or advance your career in the cloud.