Error while trying to use --boundary-query argument in sqoop import
I am learning Sqoop on my own and ran the command below to retrieve the first 3000 records from the database, split evenly on the primary key emp_no:
sqoop import \
--connect jdbc:mysql://localhost/employees \
--username root \
-P \
--query 'select * from employees WHERE $CONDITIONS ORDER BY emp_no LIMIT 3000' \
--split-by emp_no \
-m 3 \
--target-dir sqoop/import_data/employee_db_import \
--delete-target-dir
The command above yielded evenly distributed results: 1000 records per mapper.
To learn further, I then added the --boundary-query argument
--boundary-query 'select MIN(emp_no),MAX(emp_no) from employees'
to the command, and the MapReduce job now reads 9000 records from the database. Why is this happening?
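For context, Sqoop substitutes a per-mapper range for the $CONDITIONS placeholder, and each mapper then runs the full query, including the LIMIT clause. The sketch below is a simplified Python imitation of Sqoop's integer split logic (the real code is IntegerSplitter); the boundary values 10001 and 499999 are illustrative stand-ins for what the boundary query might return on the full employees table, not values taken from this question:

```python
def integer_splits(min_val, max_val, num_mappers):
    """Roughly how Sqoop divides [min_val, max_val] among mappers:
    the range is cut into num_mappers near-equal sub-ranges."""
    splits = []
    step = (max_val - min_val) / num_mappers
    lo = min_val
    for i in range(num_mappers):
        last = (i == num_mappers - 1)
        hi = max_val if last else round(min_val + step * (i + 1))
        op = "<=" if last else "<"   # last range is inclusive on the upper end
        splits.append((lo, hi, op))
        lo = hi
    return splits

query = "select * from employees WHERE $CONDITIONS ORDER BY emp_no LIMIT 3000"

# Show the query each of the 3 mappers would actually execute.
for lo, hi, op in integer_splits(10001, 499999, 3):
    cond = f"emp_no >= {lo} AND emp_no {op} {hi}"
    print(query.replace("$CONDITIONS", cond))
```

If the boundary query spans the whole table, each mapper's sub-range can contain far more than 3000 rows, so the per-mapper LIMIT 3000 caps each of the three mappers at 3000 rows, which would account for 3 × 3000 = 9000 records read in total.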
mysql hadoop mapreduce sqoop
asked yesterday by Sarvagya Dubey (699)