I am trying to use Sqoop to import a table from MySQL into HDFS. It throws a java.io.IOException: Destination folder cannot be created.

    [root@01HW288075 hadoop]# sudo -u hdfs sqoop import --username user --password pass --connect jdbc:mysql://172.16.176.109/pocdb --table stocks --verbose
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/07/30 09:40:24 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.7.0
14/07/30 09:40:24 DEBUG tool.BaseSqoopTool: Enabled debug logging.
14/07/30 09:40:24 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/07/30 09:40:24 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
14/07/30 09:40:24 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
14/07/30 09:40:24 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:mysql:
14/07/30 09:40:24 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/07/30 09:40:24 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.MySQLManager@1385660
14/07/30 09:40:24 INFO tool.CodeGenTool: Beginning code generation
14/07/30 09:40:24 DEBUG manager.SqlManager: Execute getColumnTypesRawQuery : SELECT t.* FROM `stocks` AS t LIMIT 1
14/07/30 09:40:24 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
14/07/30 09:40:25 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
14/07/30 09:40:25 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `stocks` AS t LIMIT 1
14/07/30 09:40:25 DEBUG orm.ClassWriter: selected columns:
14/07/30 09:40:25 DEBUG orm.ClassWriter:   id
14/07/30 09:40:25 DEBUG orm.ClassWriter:   symbol
14/07/30 09:40:25 DEBUG orm.ClassWriter:   quote_date
14/07/30 09:40:25 DEBUG orm.ClassWriter:   open_price
14/07/30 09:40:25 DEBUG orm.ClassWriter:   high_price
14/07/30 09:40:25 DEBUG orm.ClassWriter:   low_price
14/07/30 09:40:25 DEBUG orm.ClassWriter:   close_price
14/07/30 09:40:25 DEBUG orm.ClassWriter:   volume
14/07/30 09:40:25 DEBUG orm.ClassWriter:   adj_close_price
14/07/30 09:40:25 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
14/07/30 09:40:25 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `stocks` AS t LIMIT 1
14/07/30 09:40:25 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/stocks.java
14/07/30 09:40:25 DEBUG orm.ClassWriter: Table name: stocks
14/07/30 09:40:25 DEBUG orm.ClassWriter: Columns: id:4, symbol:12, quote_date:12, open_price:8, high_price:8, low_price:8, close_price:8, volume:4, adj_close_price:8, 
14/07/30 09:40:25 DEBUG orm.ClassWriter: sourceFilename is stocks.java
14/07/30 09:40:25 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/
14/07/30 09:40:25 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
14/07/30 09:40:25 DEBUG orm.CompilationManager: Returning jar file path /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.0.0-cdh4.7.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar
14/07/30 09:40:25 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/stocks.java
14/07/30 09:40:25 DEBUG orm.CompilationManager: Invoking javac with args:
14/07/30 09:40:25 DEBUG orm.CompilationManager:   -sourcepath
14/07/30 09:40:25 DEBUG orm.CompilationManager:   /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/
14/07/30 09:40:25 DEBUG orm.CompilationManager:   -d
14/07/30 09:40:25 DEBUG orm.CompilationManager:   /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/
14/07/30 09:40:25 DEBUG orm.CompilationManager:   -classpath
Note: /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/stocks.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/07/30 09:40:26 ERROR orm.CompilationManager: Could not make directory: /home/root1/hadoop/.
14/07/30 09:40:26 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/stocks.java to /home/root1/hadoop/./stocks.java
java.io.IOException: Destination '/home/root1/hadoop/.' directory cannot be created
    at org.apache.commons.io.FileUtils.copyFile(FileUtils.java:882)
    at org.apache.commons.io.FileUtils.copyFile(FileUtils.java:835)
    at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2385)
    at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:239)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:97)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:396)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:506)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:240)
14/07/30 09:40:26 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/stocks.jar
14/07/30 09:40:26 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d
14/07/30 09:40:26 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/stocks.class -> stocks.class
14/07/30 09:40:26 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hdfs/compile/7fcf81959d64c2f761e51076be612e9d/stocks.jar
14/07/30 09:40:26 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/07/30 09:40:26 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/07/30 09:40:26 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/07/30 09:40:26 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/07/30 09:40:26 DEBUG manager.MySQLManager: Rewriting connect string to jdbc:mysql://172.16.176.109/pocdb?zeroDateTimeBehavior=convertToNull
14/07/30 09:40:26 DEBUG manager.CatalogQueryManager: Retrieving primary key for table 'stocks' with query SELECT column_name FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = (SELECT SCHEMA()) AND TABLE_NAME = 'stocks' AND COLUMN_KEY = 'PRI'
14/07/30 09:40:26 DEBUG manager.CatalogQueryManager: Retrieving primary key for table 'stocks' with query SELECT column_name FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = (SELECT SCHEMA()) AND TABLE_NAME = 'stocks' AND COLUMN_KEY = 'PRI'
14/07/30 09:40:26 INFO mapreduce.ImportJobBase: Beginning import of stocks
14/07/30 09:40:26 WARN conf.Configuration: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
14/07/30 09:40:27 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/07/30 09:40:27 DEBUG db.DBConfiguration: Securing password into job credentials store
14/07/30 09:40:27 DEBUG mapreduce.DataDrivenImportJob: Using table class: stocks
14/07/30 09:40:27 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat: class com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat
14/07/30 09:40:28 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/sqoop-1.4.3-cdh4.7.0.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.13-bin.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/sqoop-1.4.3-cdh4.7.0.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/sqoop-1.4.3-cdh4.7.0.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jackson-core-asl-1.8.8.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/netty-3.4.0.Final.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/paranamer-2.3.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/avro-mapred-1.7.4-hadoop2.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/servlet-api-2.5-20081211.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/avro-ipc-1.7.4-tests.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/avro-1.7.4.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jetty-util-6.1.26.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/snappy-java-1.0.4.1.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-io-1.4.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.13-bin.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-compress-1.4.1.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jackson-mapper-asl-1.8.8.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/xz-1.0.jar
14/07/30 09:40:28 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/hsqldb-1.8.0.10.jar
14/07/30 09:40:28 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is inited.
14/07/30 09:40:28 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is started.
14/07/30 09:40:29 DEBUG db.DBConfiguration: Fetching password from job credentials store
14/07/30 09:40:29 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `stocks`
14/07/30 09:40:29 DEBUG db.IntegerSplitter: Splits: [                           1 to                           45] into 4 parts
14/07/30 09:40:29 DEBUG db.IntegerSplitter:                            1
14/07/30 09:40:29 DEBUG db.IntegerSplitter:                           12
14/07/30 09:40:29 DEBUG db.IntegerSplitter:                           23
14/07/30 09:40:29 DEBUG db.IntegerSplitter:                           34
14/07/30 09:40:29 DEBUG db.IntegerSplitter:                           45
14/07/30 09:40:29 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 1' and upper bound '`id` < 12'
14/07/30 09:40:29 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 12' and upper bound '`id` < 23'
14/07/30 09:40:29 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 23' and upper bound '`id` < 34'
14/07/30 09:40:29 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 34' and upper bound '`id` <= 45'
14/07/30 09:40:29 INFO mapreduce.JobSubmitter: number of splits:4
14/07/30 09:40:29 WARN conf.Configuration: mapred.job.classpath.files is deprecated. Instead, use mapreduce.job.classpath.files
14/07/30 09:40:29 WARN conf.Configuration: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
14/07/30 09:40:29 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
14/07/30 09:40:29 WARN conf.Configuration: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
14/07/30 09:40:29 WARN conf.Configuration: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
14/07/30 09:40:29 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
14/07/30 09:40:29 WARN conf.Configuration: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
14/07/30 09:40:29 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
14/07/30 09:40:29 WARN conf.Configuration: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
14/07/30 09:40:29 WARN conf.Configuration: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
14/07/30 09:40:29 WARN conf.Configuration: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
14/07/30 09:40:29 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
14/07/30 09:40:29 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1406367087547_0013
14/07/30 09:40:30 INFO client.YarnClientImpl: Submitted application application_1406367087547_0013 to ResourceManager at /0.0.0.0:8032
14/07/30 09:40:30 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1406367087547_0013/
14/07/30 09:40:30 INFO mapreduce.Job: Running job: job_1406367087547_0013

I have connectivity to MySQL: I am able to list the databases and list the tables through Sqoop, so there is no connectivity problem. If I go to the suggested URL http://localhost:8088/proxy/application_1406367087547_0013/ I get the message "The requested application does not appear to be running yet, and has not set a tracking URL." Kindly help me resolve this issue.
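
For reference, a minimal sketch of the connectivity checks described above, reusing the host and credentials from the import command (the exact commands used are an assumption, since they are not shown):

    sqoop list-databases --connect jdbc:mysql://172.16.176.109 --username user --password pass
    sqoop list-tables --connect jdbc:mysql://172.16.176.109/pocdb --username user --password pass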

  • I restarted all the HDFS and YARN services. My hadoop-yarn-nodemanager was not started; I started it as well, and then it works like a charm. Commented Jul 31, 2014 at 4:59
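
For anyone hitting the same hang, a sketch of how the NodeManager can be checked and restarted on a CDH4-style packaged install (the service names are assumed from the comment above):

    # check whether the NodeManager is running, and start it if not
    sudo service hadoop-yarn-nodemanager status
    sudo service hadoop-yarn-nodemanager start
    # if the job still does not schedule, restart the ResourceManager as well
    sudo service hadoop-yarn-resourcemanager restart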

3 Answers

Create the directory /home/root1/hadoop/ and make it world-writable:

sudo mkdir -p /home/root1/hadoop
# plain `chmod +w` is subject to the umask; a+w grants write to everyone
sudo chmod a+w /home/root1/hadoop

Run the import again and check the results. sudo is needed only if you are not root (i.e., your prompt shows $ rather than #).
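
A less permissive alternative: since the import is launched as the hdfs user (sudo -u hdfs in the question), ownership of the directory could be handed to that user instead of opening it to everyone. A sketch, assuming the user and group are both named hdfs:

    sudo chown hdfs:hdfs /home/root1/hadoop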

1 Comment

Tried it, thanks. The destination-folder-cannot-be-created error is now cleared, but it still shows the running job and then stops at that point, with no errors in the log console whatsoever. When I do a jps, the Sqoop jobs are running.
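
A way to inspect the stuck job from the command line, using the application ID printed in the log above:

    yarn application -list
    yarn application -status application_1406367087547_0013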

I can't remember exactly, but I may have had a similar problem when I did not specify the split (partition) column. So try the following command:

sqoop import --connect jdbc:mysql://172.16.176.109:3306/pocdb --username user --password pass --table stocks --target-dir /tmp/target -m 1
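
If more mappers are wanted later, a variant that names the split column explicitly (assuming id is a suitable key, as the bounding-values query in the log suggests):

    # note: the target directory must not already exist
    sqoop import --connect jdbc:mysql://172.16.176.109:3306/pocdb --username user --password pass --table stocks --target-dir /tmp/target2 --split-by id -m 4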

Hope it helps.

From the logging messages, the last part of the import directory /home/root1/hadoop/. is a dot, which will definitely fail. I think it is a bug in Sqoop; I tried Sqoop 1.4.6 and the import directory is never a dot. If you can share your Hadoop and Sqoop version info, that might be helpful.

What you can do is provide an import path explicitly with --target-dir /user/your_name/sqoopimport, but you should make sure that you have write permission to that location.
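
A minimal sketch of that suggestion applied to the original command; the target path here is illustrative, chosen under the assumption that the job runs as the hdfs user:

    # make sure the parent directory exists and is writable; the target dir itself must not exist yet
    sudo -u hdfs hdfs dfs -mkdir -p /user/hdfs/sqoopimport
    sudo -u hdfs sqoop import --username user --password pass --connect jdbc:mysql://172.16.176.109/pocdb --table stocks --target-dir /user/hdfs/sqoopimport/stocks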
