
【hadoop】Hive Installation Walkthrough

1. Download the Hive installation package:


Download from the official site: http://hive.apache.org/downloads.html

2. Upload the Hive tar package and extract it:
It is recommended to place it at the same level as the hadoop directory, which makes later use more convenient.

Extract: tar -zxvf apache-hive-1.2.1-bin.tar.gz -C /home/hadoop/hive

Rename the extracted directory: mv apache-hive-1.2.1-bin hive-1.2.1

3. Install MySQL:
MySQL is used to store Hive's metadata (see the earlier article for installation steps).

4. Edit the configuration file, mainly to configure how the metastore (metadata store) is backed:
4.1 vi /home/hadoop/hive/hive-1.2.1/conf/hive-site.xml (storage options: embedded Derby, local MySQL, or remote MySQL)

4.2 Paste the following content:

<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
        <description>JDBC connect string for a JDBC metastore</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
        <description>username to use against metastore database</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>root</value> 
        <description>password to use against metastore database</description>
    </property>
</configuration>

5. Copy the JDBC driver jar:

Copy the MySQL driver jar into Hive's lib directory.

    Download link: https://pan.baidu.com/s/17iHOIjt4XZbRAngGFf_GgA
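As a sketch of this step (the paths and driver version below are placeholders for illustration; substitute the jar you actually downloaded and your real Hive lib directory):

```shell
# Placeholder paths and jar name for illustration only; use your actual
# downloaded driver jar (e.g. mysql-connector-java-x.y.z.jar) and the real
# lib directory, e.g. /home/hadoop/hive/hive-1.2.1/lib.
HIVE_LIB=/tmp/hive-1.2.1/lib
mkdir -p "$HIVE_LIB"
touch /tmp/mysql-connector-java-5.1.49.jar   # stands in for the downloaded jar
cp /tmp/mysql-connector-java-5.1.49.jar "$HIVE_LIB"/
ls "$HIVE_LIB"
```

Without this jar on Hive's classpath, Hive cannot load com.mysql.jdbc.Driver and startup will fail when it tries to reach the metastore.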

6. Start Hive:
(1) Start the Hadoop cluster before starting Hive.

    (2) Run as the hadoop user.

Start command: /home/hadoop/hive/hive-1.2.1/bin/hive

The following prompt indicates a successful start:
hive>

7. Verify that Hive is working: after starting Hive, run the following commands:

hive> show databases;
OK
default
test_db
Time taken: 0.567 seconds, Fetched: 2 row(s)

hive> use default;
OK
Time taken: 0.068 seconds

hive> show tables;
OK
Time taken: 0.086 seconds

8. Create a database; its data files are stored on HDFS under /user/hive/warehouse/test_db.db.

hive> create database test_db;
OK
Time taken: 0.505 seconds

9、在test_db里創(chuàng)建表,表的數(shù)據(jù)文件被存放在HDFS的/user/hive/warehouse/test_db.db/t_test下面;
并且表的數(shù)據(jù)文件字段以“|”分割開;

use test_db;

create table flat1_test (mobile string,opr_type string,lastupdatetime string,monthly string,sp_code string,oper_code string,unknown string,subtime string)
row format delimited
fields terminated by '|';
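For reference, here is a hypothetical data file matching this schema — eight fields per line, "|"-separated (all field values below are made up for illustration):

```shell
# Write a sample pipe-delimited data file for the flat1_test table (8 columns:
# mobile, opr_type, lastupdatetime, monthly, sp_code, oper_code, unknown, subtime).
cat > /tmp/hivefile1.txt <<'EOF'
13800000000|open|20200101120000|1|SP001|OP001|unknown|20200101
13900000000|close|20200102130000|0|SP002|OP002|unknown|20200102
EOF
# Sanity check: every line should have exactly 8 '|'-separated fields.
awk -F'|' '{ print NF }' /tmp/hivefile1.txt
```

If a line has the wrong number of fields, Hive will not reject it at load time; the missing trailing columns simply come back as NULL when queried, so checking the file up front saves debugging later.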

10. Upload the data file to the table's directory on HDFS (the directory backing the Hive table):
hadoop fs -put hivefile1.txt /user/hive/warehouse/test_db.db/flat1_test

11. Query the data with SQL:
hive> select * from flat1_test;

12. Inspect Hive's metadata by querying it from within MySQL:

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| hive               |
| mysql              |
| performance_schema |
| test               |
+--------------------+
5 rows in set (0.00 sec)

mysql> use hive;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> 
mysql> show tables;
+---------------------------+
| Tables_in_hive            |
+---------------------------+
| BUCKETING_COLS            |
| CDS                       |
| COLUMNS_V2                |
| DATABASE_PARAMS           |
| DBS                       |
| FUNCS                     |
| FUNC_RU                   |
| GLOBAL_PRIVS              |
| IDXS                      |
| INDEX_PARAMS              |
| PARTITIONS                |
| PARTITION_KEYS            |
| PARTITION_KEY_VALS        |
| PARTITION_PARAMS          |
| PART_COL_PRIVS            |
| PART_COL_STATS            |
| PART_PRIVS                |
| ROLES                     |
| SDS                       |
| SD_PARAMS                 |
| SEQUENCE_TABLE            |
| SERDES                    |
| SERDE_PARAMS              |
| SKEWED_COL_NAMES          |
| SKEWED_COL_VALUE_LOC_MAP  |
| SKEWED_STRING_LIST        |
| SKEWED_STRING_LIST_VALUES |
| SKEWED_VALUES             |
| SORT_COLS                 |
| TABLE_PARAMS              |
| TAB_COL_STATS             |
| TBLS                      |
| TBL_COL_PRIVS             |
| TBL_PRIVS                 |
| VERSION                   |
+---------------------------+
35 rows in set (0.01 sec)

mysql> select * from DBS;
+-------+-----------------------+-----------------------------------------------------------+---------+------------+------------+
| DB_ID | DESC                  | DB_LOCATION_URI                                           | NAME    | OWNER_NAME | OWNER_TYPE |
+-------+-----------------------+-----------------------------------------------------------+---------+------------+------------+
|     1 | Default Hive database | hdfs://XXXXXXXXXX:9000/user/hive/warehouse            | default | public     | ROLE       |
|     6 | NULL                  | hdfs://XXXXXXXXXX:9000/user/hive/warehouse/test_db.db | test_db | hadoop     | USER       |
+-------+-----------------------+-----------------------------------------------------------+---------+------------+------------+
2 rows in set (0.00 sec)

mysql>