Java 1.8 + Sqoop 1.4.7
This post is mostly a note to self: I have been working on this recently, found very little documentation online, and wanted to write it down.
Maven Dependencies
the JDBC driver for the source database (mysql-connector-java here)
commons-lang3
avro and avro-mapred
hadoop-hdfs and hadoop-common
the MapReduce client jars
pom.xml
<!-- JDBC driver; the original gives no version, so either add one explicitly
     or let dependencyManagement (e.g. the Spring Boot parent) supply it -->
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.sqoop</groupId>
    <artifactId>sqoop</artifactId>
    <version>1.4.7</version>
</dependency>
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.0</version>
</dependency>
<!-- hadoop -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.8.4</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.8.4</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.8.4</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.8.4</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.8.4</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-mapred</artifactId>
    <version>1.8.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-common</artifactId>
    <version>2.3.2</version>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.8.1</version>
</dependency>
Example Code
The example code is the same as the test code found online and in the official documentation.
package com.example.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;
import org.apache.sqoop.hive.HiveConfig;
import org.apache.sqoop.tool.SqoopTool;

import java.io.IOException;

public class SqoopTest {
    public static void main(String[] args) throws IOException {
        System.out.println("begin test sqoop");
        // Equivalent CLI invocation:
        // sqoop import --connect jdbc:mysql://localhost:3306/testsqoop?useSSL=false \
        //   --username root --password root --table data_table \
        //   --hive-import --hive-database testsqoop --hive-overwrite \
        //   --create-hive-table --hive-table data_table --delete-target-dir
        String[] argument = new String[] {
                "--connect", "jdbc:mysql://localhost:3306/testsqoop?useSSL=false",
                "--username", "root",
                "--password", "root",
                "--table", "data_table",
                "--hive-import", "--hive-database", "testsqoop",
                "--hive-overwrite", "--create-hive-table",
                "--hive-table", "data_table",
                "--delete-target-dir",
        };
        // getTool returns the new org.apache type, but the Sqoop constructor still
        // takes the legacy com.cloudera type, hence the cast.
        com.cloudera.sqoop.tool.SqoopTool sqoopTool =
                (com.cloudera.sqoop.tool.SqoopTool) SqoopTool.getTool("import");
        Configuration conf = new Configuration();
        // "fs.default.name" is the deprecated spelling of "fs.defaultFS"; both still work.
        conf.set("fs.default.name", "hdfs://localhost:9000");
        // Loads the Hive configuration; here it only verifies that Hive config is reachable.
        HiveConfig.getHiveConf(conf);
        Sqoop sqoop = new Sqoop(sqoopTool, SqoopTool.loadPlugins(conf));
        int res = Sqoop.runSqoop(sqoop, argument);
        System.out.println(res); // 0 on success
        System.out.println("sqoop import finished");
    }
}
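The same embedding pattern works for Sqoop's other tools by swapping the tool name and the argument list. As a minimal, untested sketch (the target table and export directory below are made-up placeholders, and this body would be dropped into a main method with the same imports as above), the reverse direction, exporting from HDFS back into MySQL, would look like this:

// Hypothetical export (HDFS -> MySQL); table name and export dir are placeholders.
String[] exportArgs = new String[] {
        "--connect", "jdbc:mysql://localhost:3306/testsqoop?useSSL=false",
        "--username", "root",
        "--password", "root",
        "--table", "data_table_copy",
        "--export-dir", "/user/hive/warehouse/testsqoop.db/data_table",
};
com.cloudera.sqoop.tool.SqoopTool exportTool =
        (com.cloudera.sqoop.tool.SqoopTool) SqoopTool.getTool("export");
Configuration conf = new Configuration();
conf.set("fs.default.name", "hdfs://localhost:9000");
int res = Sqoop.runSqoop(new Sqoop(exportTool, SqoopTool.loadPlugins(conf)), exportArgs);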
The import code above has been tested and runs both in a local environment and from within a Spring Boot web service.
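For the Spring Boot case, a minimal sketch of wiring the call into a REST endpoint might look like the following. The controller name, endpoint path, and synchronous execution are my assumptions, not part of the original post; it assumes spring-boot-starter-web is on the classpath, and the method is synchronized as a precaution because Sqoop keeps some static state in the JVM.

package com.example.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;
import org.apache.sqoop.tool.SqoopTool;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller; names and path are illustrative only.
@RestController
public class SqoopImportController {

    @PostMapping("/sqoop/import")
    public synchronized String runImport(@RequestParam String table) {
        String[] argument = new String[] {
                "--connect", "jdbc:mysql://localhost:3306/testsqoop?useSSL=false",
                "--username", "root",
                "--password", "root",
                "--table", table,
                "--hive-import", "--hive-database", "testsqoop",
                "--hive-overwrite", "--create-hive-table",
                "--hive-table", table,
                "--delete-target-dir",
        };
        com.cloudera.sqoop.tool.SqoopTool tool =
                (com.cloudera.sqoop.tool.SqoopTool) SqoopTool.getTool("import");
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://localhost:9000");
        int res = Sqoop.runSqoop(new Sqoop(tool, SqoopTool.loadPlugins(conf)), argument);
        return res == 0 ? "import succeeded" : "import failed, exit code " + res;
    }
}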