[Spark 95] Using Spark SQL from the Spark Shell


In the Spark Shell, you can work with Hive directly by creating a HiveContext.

 

1. Working with tables that already exist in Hive

 

[hadoop@hadoop bin]$ ./spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.

// Create a HiveContext
scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@42503a9b

// Switch databases. This is a native command executed locally by Hive; it launches no distributed job.
scala> sqlContext.sql("use default");
res1: org.apache.spark.sql.SchemaRDD = SchemaRDD[0] at RDD at SchemaRDD.scala:108
== Query Plan ==
<Native command: executed by Hive>

// List the tables in the current database. This is a transformation that produces an RDD.
scala> sqlContext.sql("show tables");
res2: org.apache.spark.sql.SchemaRDD = SchemaRDD[2] at RDD at SchemaRDD.scala:108
== Query Plan ==
<Native command: executed by Hive>

// Fetch the result
scala> sqlContext.sql("show tables").collect;
res3: Array[org.apache.spark.sql.Row] = Array([abc], [avro_table], [employees], [invites], [my_word], [mytable1], [parquet_table], [table1], [word], [word3], [word4], [word5], [word6])

// Actually execute the database switch
scala> sqlContext.sql("use default").collect;
res4: Array[org.apache.spark.sql.Row] = Array()

// Query the Hive table word6. This is only a transformation; nothing runs yet.
scala> sqlContext.sql("select * from word6")
res5: org.apache.spark.sql.SchemaRDD = SchemaRDD[8] at RDD at SchemaRDD.scala:108
== Query Plan ==
== Physical Plan ==
HiveTableScan [id#20,word#21], (MetastoreRelation default, word6, None), None

// Query the Hive table word6, this time actually executing the query
scala> sqlContext.sql("select * from word6").collect
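For reference, the same flow can be packaged as a standalone application instead of being typed into the REPL. The following is a minimal sketch against the Spark 1.2-era API, assuming the same default database and word6 table as in the transcript above; the object name HiveQueryExample and the application name are invented for illustration.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Minimal sketch: the shell session above as a standalone app (Spark 1.2-era API).
object HiveQueryExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HiveQueryExample") // app name is illustrative
    val sc = new SparkContext(conf)
    val sqlContext = new HiveContext(sc)

    // Native Hive command: executed by Hive locally, no distributed job.
    sqlContext.sql("use default")

    // Transformation only: builds a SchemaRDD with a HiveTableScan plan; nothing executes yet.
    val words = sqlContext.sql("select * from word6") // word6 is the table from the transcript

    // Action: triggers the scan and brings the rows back to the driver.
    words.collect().foreach(println)

    sc.stop()
  }
}

As in the shell session, nothing touches the cluster until collect() is called; sql(...) alone only builds the query plan shown in the transcript.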

 

 

 
