A database in Hive is a namespace, or a collection of tables, and Hive provides a database query interface to Apache Hadoop. The CREATE DATABASE statement is where everything starts; its syntax appears below. If a database with the same name already exists, an exception is thrown. For valid identifiers to use as database names, see Identifier Names.

Once an HDInsight cluster is configured, we generally use either the portal dashboard (powered by Ambari) or a tool like PuTTY to execute queries against the loaded data; Hive queries can also be run from Visual Studio, and a JDBC client such as SQuirreL SQL Client works as well. There are multiple ways to load data into Hive. If you haven't created a database for the Hive metastore, create one now; when no metastore is configured in hive-site.xml, Hive falls back to an embedded Derby database. To write and execute a Hive script, we need to install a Hadoop distribution such as Cloudera CDH4. Macros are functions defined in terms of other Hive functions.

Creating Hive tables: use the CREATE TABLE HiveQL statement to create tables that project onto the files to be queried. There are a few things you need to know about Hive internal and external tables; the CREATE TABLE command creates an internal (managed) table by default. A table defined over a query rather than stored data is known as a temporary table or a view. We can also create an empty table and load data into it later, and later in this article we will see how to create a Hive table for Parquet-format data. Newer distribution releases come with the PostgreSQL JDBC driver in the Hive library path.

If a team's tables are meant to be private, other users should not be able to see those tables or do anything with them; the cost is the overhead of creating an additional user group and setting up LDAP or Kerberos for authentication. Objective: creating Hive tables is really an easy task. This tutorial was written using SQL Server 2012 and a three-node Hortonworks cluster running HDFS 2.x; SQL Server databases are some of the most common databases in use, thanks in part to how easy they are to create and maintain from SQL Server Management Studio. You can select and load data from an Apache Hive database.

As a first example, create the database "guru99" in Hive and then display the existing databases with the SHOW DATABASES command; "guru99" appears at the end of the list. As Dean Wampler argued in "Hive: SQL for Hadoop", Hive is indispensable to people creating data warehouses with Hadoop, because it gives them a similar SQL interface to their data, making it easier to migrate skills and even applications from existing relational tools to Hadoop.

Using the JDBC URL information, Hive data can also be exported to a CSV (comma-separated values) file, for example: set hive.exec.compress.output=false; create table csv_dump ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' AS SELECT * FROM …;. An external table can likewise point at object storage: CREATE EXTERNAL TABLE myTable (key STRING, value INT) LOCATION 'oci://mybucket@<namespace>/myDir/', where myDir is a directory in the bucket mybucket and <namespace> stands in for the object storage namespace.
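A minimal HiveQL sketch of these first steps is shown below; the database name guru99 comes from the example above, while the table name and HDFS path are illustrative placeholders rather than values from the original:

    -- Create a database; without IF NOT EXISTS this fails if guru99 already exists.
    CREATE DATABASE guru99;

    -- List all databases; the new name appears in the output.
    SHOW DATABASES;

    -- An external table projected onto files that already live in HDFS.
    CREATE EXTERNAL TABLE my_table (key STRING, value INT)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/user/hive/external/my_table';

The bare CREATE DATABASE form matches the exception behaviour described above: a second run with the same name fails unless IF NOT EXISTS is added.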
Hive databases are not strictly necessary for small installations, but they are very useful for larger clusters with multiple teams and users, as a way of avoiding table name collisions. Database and schema support was added in Hive 0.6 (HIVE-675). Hive also provides a default database named default, which we do not need to create. The SHOW DATABASES statement lists all databases, or only the databases whose names match a wildcard pattern.

Apache Hive is a data warehouse infrastructure built on top of Hadoop for providing data summarization, query, and analysis; it allows users to ingest and analyse large amounts of data in a distributed environment using SQL, and HiveQL is the query language powered by Apache Hive. From that description alone it is not obvious whether Hive is itself a database solution like HBase or Cassandra: it is not; it is a SQL layer over data stored in Hadoop. (Apache ZooKeeper, by contrast, is an effort to develop and maintain an open-source server that enables highly reliable distributed coordination, and Spark is an Apache project advertised as "lightning fast cluster computing".)

Hive temporary tables are used to store intermediate or complex query results that we do not want to keep permanently in database tables. A temporary table exists only in the session or terminal window where it was created; once you close that session, the table is no longer visible in the database or anywhere else. Creating an index is common practice with relational databases when you want to speed access to a column or set of columns, and Hive offers indexing too, though it is used far less often.

Suppose one more application is connected to your environment and needs to receive the data from the infostore table. Sqoop's create-hive-table step effectively performs the --hive-import part of sqoop-import without running the preceding import. With external tables, you can also access data stored in HDFS and in Hive tables from Hadoop and Hive clients as if that data were stored in tables in an Oracle Database. In order to build an SSAS cube with data from Hive in your Hadoop environment, we need to use SQL Server as an interface, which can be done by adding a Linked Server to the Hive instance. From Oracle itself you can reach Hive through an ODBC gateway: SQL> create public database link hivedsn connect to cloudera identified by cloudera using 'HIVEDSN'; Database link created. Getting the Linux MySQL ODBC driver and adding an entry to odbc.ini is part of that setup. (To clarify a related point: in Superset, the database backend is an OLTP database used by Superset to store its internal information, such as your list of users, slices, and dashboard definitions.)

After completing the previous steps to create and configure the s3 protocol, you can specify an s3 protocol URL in the CREATE EXTERNAL TABLE command to define S3 external tables. The SHOW CREATE TABLE command prints the stored DDL for a table or view, for example: hive> SHOW CREATE TABLE test_view; OK CREATE VIEW `test_view` AS SELECT … FROM `default`.`…`. One caution from experience: renaming a database's .db directory in the warehouse to a new database name does not work; if you then use the renamed database and run a select * on any of its tables, no rows come back, because the metastore still points at the old location.
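The following is a rough HiveQL sketch of the database listing, temporary table, and DDL inspection features mentioned above; the table some_table and the aggregate query are assumptions for illustration, and CREATE TEMPORARY TABLE requires Hive 0.14 or later:

    -- Filter database names with a wildcard pattern.
    SHOW DATABASES LIKE 'ser*';

    -- A temporary table is visible only to the current session.
    CREATE TEMPORARY TABLE tmp_counts AS
    SELECT key, COUNT(*) AS cnt
    FROM some_table
    GROUP BY key;

    -- Print the DDL Hive has stored for a table or view.
    SHOW CREATE TABLE test_view;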
Hive CREATE DATABASE is used to create a database in Hive; each database holds a set of tables, just as databases and tables do in an RDBMS. The database in Hive describes a collection of tables that are used for a similar purpose or belong to the same group. Preface: this part briefly covers creating, querying, locating, altering, and dropping databases in Hive. Hive is a database technology that can define databases and tables to analyze structured data; put differently, it is a data-summarizing tool that supports databases and tables over structured data. In database terms, a query is used to retrieve data from the database, and sound relational database design still matters on Hadoop.

The database directory is created under the directory specified in the parameter hive.metastore.warehouse.dir. All Hive implementations need a metastore service, where Hive stores its metadata; the embedded metastore that ships with Hive is recommended only for testing or ad-hoc development. Some tables may be missing from your metastore schema if you have not created the corresponding Hive catalog objects, e.g. the PARTITIONS table will probably not exist until at least one partitioned table has been created. A permission-denied error while creating a database is a common problem in the Hadoop ecosystem, and the resolution usually comes down to HDFS permissions on the warehouse directory. For verbose troubleshooting output, start the CLI with hive -hiveconf hive.root.logger=DEBUG,console.

Hive tables can be created with the CREATE TABLE command, which is very similar to MySQL syntax with some Hive extensions, and a copy of an existing table can also be created using CREATE TABLE. Unlike a relational database, though, Hive loads data with the LOAD DATA statement rather than row-by-row inserts. ORC format improves performance when Hive is processing the data: an ORC file contains groups of row data, called stripes, along with a file footer. A partition column behaves like any other column in queries; however, it is not part of the data itself but is derived from the partition that a particular dataset is loaded into. Tableau supports the TIMESTAMP and DATE types natively.

Hive is also extensible. Apache Phoenix supports mapping to an existing HBase table through the CREATE TABLE and CREATE VIEW DDL statements, and PL/HQL currently converts data types and removes NOT NULL/NULL constraints and default values when translating DDL from other databases. A custom function such as today() can be used directly in a predicate, for example WHERE dt < today(). All these abilities make Hive a very flexible and extensible system when it comes to supporting different data formats, data types, or functions. Now, instead of writing challenging MapReduce code, you can design a star schema data warehouse or a normalized database. Sqoop moves data in from relational sources, and Talend (big data edition) can connect to a Hive database running on Hadoop, create a table, and insert or load data into it; Part #1 of that series covers writing data into Hive. Tooling around Hive is broad: Hortonworks creates, distributes, and supports enterprise-ready open data platforms, Zoho Creator lets you create a database application afresh or import data from standard formats, and you can build dynamic Hive grids using the Infragistics XamDataGrid together with the CData ADO.NET provider. Some integrations require a JSON configuration file, and on the delivery side, replicating into Hadoop is slightly different and brings new options as well. Once such a connection is in place, I can connect and perform all queries (inserts, selects, etc.). Audit results of activity and performance monitoring provide an extra security measure and forensic evidence in case of cybersecurity incidents.

Figure 3: HDInsight Hive Editor.
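As a hedged HiveQL sketch of a database created at a custom location and then loaded, the names sales_db, its warehouse path, and the orders table below are illustrative assumptions (the orders table is assumed to have been created beforehand):

    -- Create a database under a custom warehouse path and inspect it.
    CREATE DATABASE IF NOT EXISTS sales_db
      COMMENT 'Tables used by the sales team'
      LOCATION '/data/warehouse/sales_db.db';

    DESCRIBE DATABASE EXTENDED sales_db;

    -- Move a delimited file from a staging directory into a table in that database.
    LOAD DATA INPATH '/staging/orders.csv' INTO TABLE sales_db.orders;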
We can create a table using SerDe properties by specifying the length of each column. The Hortonworks cluster used here runs on CentOS 7. To keep queries repeatable, you can save them in a shell script file (for example, Hive_SH.sh) and run the script from the command line. In the Hive function cheat sheet you will find date functions, mathematical functions, collection functions, conditional functions, functions for text analytics, and more. On the left, locate and click the View File Location action.

Hive - Create Database. Apache Hive's metastore is normally configured with an embedded Derby database, and this is a step-by-step guide on how to configure a MySQL metastore for Hive in place of the default Derby metastore. If the metastore is hosted on Amazon RDS, replace the RDS endpoint in the JDBC URL with the endpoint for your RDS instance. If you connect from code, add the Hive JDBC driver to your build, for example the "org.apache.hive" % "hive-jdbc" dependency (0.x line) in sbt.

The Hive Connector also supports creating tables with different row formats and data storage formats; the default stage properties for a source Hive Connector stage are documented with the connector. The File Connector, through its Hive table create option, can create an external Hive table on top of the HDFS data that it loads. Let's get started: log into the Hadoop cluster via ssh and enter the Hive command line by typing hive at the Linux prompt.

There are several scenarios in which a traditional database like SQL Server or Oracle is not the best way to store data, and Hive covers many of them. There are two types of tables in Hive, internal and external. You can also create the Hive table while importing data with a Sqoop command; any database supported by Sqoop 1.x can act as the source, and the IKM SQL to HDFS Hive (Sqoop) knowledge module is designed to import data from a relational database into a Hive table. Use of SHOW CREATE VIEW requires the SHOW VIEW privilege, and the SELECT privilege for the view in question. Displaying the contents of the table Company1Hive: now let us see the transferred contents in the table Company1Hive.

Step 1: create a temporary (staging) table in Hive. Step 2: create an ORC-formatted table in Hive and populate it from the staging table. With the alternate form of the CREATE TABLE statement, the column names and/or the column data types can be specified by providing a query (CREATE TABLE ... AS SELECT); similarly, we need STORED AS PARQUET to create a Hive table for Parquet-format data. For example, hive> create database serviceorderdb; returns OK along with the time taken. Usually, you will be selective with how much data you want returned. When a table is partitioned by day, what we are saying is that we want all the rows for a day separated out into their own directory and file(s). Note: in Toad Data Point, this feature is available in the Professional Edition only.

Two asides from related tooling: if you have an "SQL-closed" database (that is, your own application code is the only source of newly inserted records), you can leave out the trigger, but then you should always obtain a new generator value from the database before issuing the INSERT statement and include it there; and a separate article provides an introduction to Spark, including use cases and examples. For a list of supported components in HDInsight, see "What's new in the Hadoop cluster versions provided by HDInsight?"
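A sketch of the two-step staging-to-ORC pattern described above follows; the column layout and the drivers.csv path are assumptions chosen to match the drivers example mentioned elsewhere in this article, not the original listing:

    -- Step 1: a plain-text staging table matching the raw file layout.
    CREATE TABLE drivers_stage (driver_id INT, name STRING, city STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

    LOAD DATA INPATH '/staging/drivers.csv' INTO TABLE drivers_stage;

    -- Step 2: an ORC-formatted copy, populated from the staging table via CTAS.
    CREATE TABLE drivers_orc STORED AS ORC
    AS SELECT * FROM drivers_stage;

The same two-step pattern works for Parquet by replacing STORED AS ORC with STORED AS PARQUET.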
If you don't have an Azure subscription, create a free account before you begin. In HDInsight, we use an Azure SQL database as the Hive metastore; you can use either the Web edition or the Business edition. By default, stand-alone Hive uses a built-in Derby SQL server for the metastore instead. A known issue after a Mammoth upgrade (on Oracle Big Data Appliance) is that the Hive Metastore Server is reported in bad health with the error "The Hive Metastore canary failed to create a database." To back up the metastore databases, switch to the hive user (su hive) and run mysqldump --databases … --single-transaction > hive_databases.sql.

The Create Database statement is one of Hive's DDL commands; create, drop, alter, and use database are all database DDL commands. Hive's functions can be extended as well: a UDTF takes zero or more inputs and produces multiple columns or rows of output, and a session-scoped function can be registered with CREATE TEMPORARY FUNCTION today AS 'yourpackage.…' (the class name is truncated in the source). Writing Hive scripts is covered separately. Remember to also create any required indexes, constraints, and referential integrity when modelling in a relational target; Hive itself enforces none of these.

On the Linux cluster, create a new user and add that user to the hadoop group; we will need this to create our external table. A common requirement, for example on a CDH4 cluster, is to create one database per Linux user and give access to that database only to that user or group. Hive's default authorization only helps prevent users from accidentally doing operations they are not supposed to do. Consider also the scenario where one more application is connected to your system but is not allowed to read a Hive table directly for security reasons. Specifically, a Hive connection is pointed directly at the HiveServer or HiveServer2 port. (From a user question: can we source or target data from and to Hive through PowerCenter Designer? The Hortonworks ODBC driver lets you import from a Hive database in PowerCenter Designer, but at the session level there is no provision to select the Hive application connection under the Readers/Connections section.)

This is part 1 of a two-part series on how to update Hive tables the easy way; historically, keeping data up to date in Apache Hive required custom application development that is complex and non-performant. In Qlik Sense, you load data through the Add data dialog or the Data load editor, and Apache Hive is one of the NoSQL data sources supported by Toad. DatabaseSpy includes a command to create complete DDL scripts for a database schema. A Spark data frame can also be written straight into Hive, for example into the database bdp_db with the table name "jsonTest".
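Below is a hedged sketch of registering and using a session-scoped UDF like the today() function mentioned above; the jar path, the class name TodayUDF, and the logs table are placeholders, since the original truncates the class name:

    -- Make the jar containing the UDF available to this session and register it.
    ADD JAR /tmp/my-hive-udfs.jar;
    CREATE TEMPORARY FUNCTION today AS 'yourpackage.TodayUDF';

    -- Use the function in a predicate.
    SELECT *
    FROM   logs
    WHERE  dt < today();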
We can switch databases with the USE database command; to know the current working database we can run SELECT current_database(), and to see the DDL used to create a table we can run SHOW CREATE TABLE tablename. To place a new table in a specific database, either issue the USE database_name statement before the CREATE TABLE statement (supported since Hive 0.6) or qualify the table name with the database name ("database_name.table_name", supported since Hive 0.7). Tables in a database are stored in subdirectories of the database directory, and we should be able to assign key-value pairs of properties to Hive databases as well. The IF NOT EXISTS clause means that if a database with the same name already exists, nothing will happen. Hive also provides views for summarization and ad-hoc queries for data analysis. A local table, by contrast, is not accessible from other clusters and is not registered in the Hive metastore.

The points below can be useful when setting things up. Create an Apache Hive connection, then create a job to load Hive. You can also use preemptible VMs for noncritical data processing or to create very large clusters at a lower total cost. If the cluster is secured with Kerberos, create the Kerberos database with /usr/sbin/kdb5_util create -s on RHEL, CentOS, or Oracle Linux (on SLES the command is kdb5_util create -s). Microsoft SSIS (SQL Server Integration Services) is a tool used to move data to and from other sources into Microsoft SQL Server, and Sqoop can likewise import data from a relational database into a Hive data warehouse: the Sqoop command discussed above creates a new table named Company1Hive in Hive's default database and transfers the three mentioned columns (EmpId, EmpName, and City) from the MySQL table Company1 into it. Next, create the query to populate the Hive table temp_drivers from the drivers data. Then proceed as follows.

This chapter explains how to create a Hive database; creating Hive tables is a common experience for all of us who use Hadoop, and the Hive queries used here are shared in a GitHub repository and can be downloaded from there. I need a suitable table, loaded with appropriate data, to demonstrate HiveQL; this example will access the data in our sample database, myDB. Hive was previously a subproject of Apache Hadoop but has now graduated to become a top-level project of its own. As an example of interoperability, you can convert SQL from another database and create the corresponding table in Hive, and in practice I have achieved that result and seen the effective performance of a Hive ORC table.
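A brief HiveQL sketch of switching databases and attaching database properties follows; serviceorderdb is reused from the earlier example, while the property names and values are illustrative assumptions:

    -- Switch the working database and confirm where we are.
    USE serviceorderdb;
    SELECT current_database();

    -- Attach key-value properties to the database and read them back.
    ALTER DATABASE serviceorderdb
      SET DBPROPERTIES ('owner' = 'data-eng', 'purpose' = 'service orders');
    DESCRIBE DATABASE EXTENDED serviceorderdb;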
In the step above, we created a Hive table named Students in the database college, with fields such as ID, Name, fee, and city. Create Table is the statement used to create a table in Hive, and the following two types of tables can be created: internal tables, which manage the lifetime of the underlying folders, and external tables. Dropping an external table does not remove the data from storage; after the drop, the data is not gone, only the metadata is. Drop Database, likewise, is a statement that drops all the tables and then deletes the database. When you do want managed data removed immediately, use the PURGE option to skip the trash directory. Hive is not a full database: the design rules and regulations of Hadoop and HDFS put restrictions on what Hive can do, and under the hood Hive uses MapReduce. Partition Hive tables and use the Optimized Row Columnar (ORC) format to improve query performance; each table's storage description also names the "input format" and "output format" used to read and write its files.

You can also create a database in a different, custom folder by giving it an explicit path. Using a JDBC program, we can create a database from Java as well; as an exercise, create a new Hive database called moviework. Can I change the datatype of a column in a Hive database? Complete information about that follows later. Once you have become familiar with starting the network server and running SQL queries, you will want to use Derby from within a Java application. (One user note: "I wish I could just configure it to not use Hive, but I could not find a way.")

Hive plays well with surrounding tools. In Radoop, right-click the object in this view and click Create Process: Retrieve to instantly create a new Radoop process that uses this data as an input. In another example we use the Flexter XML converter to generate a Hive schema and parse an XML file into a Hive database, and a related post walks through creating an Avro table in Hive automatically, where the goal was a process for importing data into Hive using Sqoop. In Kudu, databases are an implicit part of the table name. In a modelling tool, the Master Catalog is where the base data model is designed in the form of subject areas. For authorization, click on the policy group name under Hive. (I only mention Drupal because I have heard of many cases where people use Drupal Views to generate SQL and then reuse that SQL in other places.)
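The contrast between managed and external tables can be sketched in HiveQL as follows; the table names, columns, and the /data/events/ path are assumptions for illustration:

    -- Managed (internal) table: DROP TABLE removes metadata and data.
    CREATE TABLE managed_events (id INT, payload STRING);

    -- External table: DROP TABLE removes only the metadata; the files stay put.
    CREATE EXTERNAL TABLE external_events (id INT, payload STRING)
      LOCATION '/data/events/';

    DROP TABLE managed_events PURGE;   -- PURGE bypasses the trash directory
    DROP TABLE external_events;        -- underlying files remain in /data/events/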
Install Hive with a MySQL metastore: the Apache Hive metastore consists of a relational database that stores the metadata (such as table schemas, partitions, and buckets) and a metastore service API for accessing the information stored in that database. All Hive implementations need such a metastore service. Hive looks like traditional database code with SQL access, and a command-line tool and a JDBC driver are provided to connect users to Hive through the standard java.sql interfaces; RazorSQL, for example, can connect to Apache Hive databases via JDBC on Windows, Mac, and Linux. One related project is intended to be a minimal Hive/Presto client that does that one thing and nothing else. Creating a custom Hive metastore is also possible. (This post was originally published on Hortonworks.com before the merger with Cloudera.)

A basic scenario in Hive looks like this: dump some data from an Oracle database, load it to HDFS, and query the data using Hive. Sqoop does have a create-hive-table tool which can create the Hive schema for you. Hadoop is a framework for handling large datasets in a distributed computing environment, and if you work with star schemas in Hive or Impala, take a step back and ask whether you really need them and whether the overall data flow in your infrastructure is correct. Hive also supports transactions (insert, update, and delete). One operational caveat: dropping a table can fail when its data cannot be moved to trash because it sits in an encryption zone while trash is enabled. Zoho Creator, as an aside, provides three different ways to create databases, and if you build a client application, start by creating a new project in IntelliJ IDEA.

Make a note that the HiveQL syntax below is case-insensitive, but for better readability we use keywords in uppercase. ROW FORMAT should list the delimiters used to terminate fields and lines; in the example above, the fields are terminated with a comma (','). Without an index, the database system has to read all rows in the table to find the data you have selected. The CREATE DATABASE statement in its simplest form is just the keywords and a name, for example hive> CREATE DATABASE test_hive_db; which returns OK along with the time taken. (For Kerberos setups, recall that the -s option stores the master server key for the database in a stash file.) Creating an internal table comes next.
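A rough sketch of an internal table with explicit delimiters, placed in the test_hive_db database created above, is shown below; the table name and columns are illustrative, not from the original:

    -- A managed table in test_hive_db with explicit field and line delimiters.
    CREATE TABLE test_hive_db.employees (
      emp_id   INT,
      emp_name STRING,
      city     STRING
    )
    ROW FORMAT DELIMITED
      FIELDS TERMINATED BY ','
      LINES TERMINATED BY '\n'
    STORED AS TEXTFILE;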
Using keywords and various sorting commands, users of any database can rapidly search, rearrange, group, and select fields across many records to retrieve or create reports on particular aggregates of data. Hive brings that model to Hadoop, but it is a data warehouse designed for batch jobs with high latency. Hive offers a SQL-like query language called HiveQL, which is used to analyze large, structured datasets; once the data is loaded into a table, you will be able to run HiveQL statements to query it, and functionality such as filtering and joins can be performed on the tables. Cases to avoid are those where heavy groupings are used, and in general avoid copying and pasting your existing data storage and processing strategies from an RDBMS straight into Hive and Impala.

A Hive editor (similar to SQL Server Management Studio, or SSMS) may be launched from the command line, through Hue, or from the Manage Cluster link in the HDInsight dashboard (see Figure 3); you can also run Hive queries from a shell script. The concept of an internal table in Hive is very similar to a table in a relational database, and simple Hive table creation follows the CREATE TABLE pattern shown earlier; a partitioned ORC table such as temps_orc_partition_date is created the same way, with PARTITIONED BY and STORED AS ORC clauses. For database-level metadata, the proposed syntax is similar to the CREATE TABLE and CREATE INDEX commands: CREATE DATABASE ... WITH DBPROPERTIES ('key1' = 'value1', 'key2' = 'value2');.

On security, the default authorization in Hive is not designed with the intent to protect against malicious users accessing data they should not be accessing. On connectivity, one user reports being unable to browse the tables of an Azure SQL Server database from within Oracle SQL Developer (version 4.x); and when the Object View tab for a new connection opens, enter a name for the connection in the Name field, optionally enter a description in the Notes field, and leave the Database Type as Auto Detect.

As a result, we have now seen all the Hive DDL commands, with syntax and examples: the Create Database statement, Show Database, Drop Database, creating Hive tables, browsing a table, altering and dropping tables, selecting data from a table, and loading data into Hive.
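A hedged sketch of the two statements just mentioned follows; the database name analytics and the column list of temps_orc_partition_date are assumptions, since the original truncates both:

    -- Database-level properties supplied at creation time.
    CREATE DATABASE analytics
      WITH DBPROPERTIES ('key1' = 'value1', 'key2' = 'value2');

    -- A partitioned ORC table; the partition column is metadata, not stored in the files.
    CREATE TABLE temps_orc_partition_date (
      station_id INT,
      temp       DOUBLE
    )
    PARTITIONED BY (dt STRING)
    STORED AS ORC;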
As noted earlier, the PARTITIONS table will probably not exist if you have not created any table partitions in your metastore. To configure a PostgreSQL DB instance as the external metastore for Hive, the broad outline is: create the PostgreSQL instance with a metastore database and user on it, make sure the PostgreSQL JDBC driver is on Hive's classpath, and point the javax.jdo.option connection properties in hive-site.xml at that instance.
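Once hive-site.xml points at the external metastore, a quick HiveQL smoke test (the throwaway database name below is my own, not from the original) confirms that Hive can write to it:

    -- Create, list, and drop a throwaway database against the new metastore.
    CREATE DATABASE IF NOT EXISTS metastore_smoke_test;
    SHOW DATABASES LIKE 'metastore_smoke_test';
    DROP DATABASE metastore_smoke_test;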