Export .zip files from an FTP server directly to HDFS



























I have an academic project where an FTP server (a website) hosts compressed CSV files. I need to automatically export a subset of them directly into HDFS.

hadoop ftp hdfs
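A minimal sketch of one way to do this in Python, using only `ftplib`/`zipfile` from the standard library plus the `hdfs` package (a WebHDFS client, `pip install hdfs`). The host names, HDFS paths, and the `/data/csv/` target directory below are placeholders, not values from the question:

```python
# Sketch: download .zip files from an FTP server, extract the CSV members
# in memory, and write them to HDFS over WebHDFS.
# All connection details here are placeholder assumptions.
import io
import zipfile
from ftplib import FTP


def fetch_zip(ftp: FTP, filename: str) -> bytes:
    """Download one file from the FTP server into memory."""
    buf = io.BytesIO()
    ftp.retrbinary(f"RETR {filename}", buf.write)
    return buf.getvalue()


def extract_csv_members(zip_bytes: bytes):
    """Yield (name, data) for every .csv member inside a zip archive."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if name.lower().endswith(".csv"):
                yield name, zf.read(name)


def main():
    # Placeholder endpoints -- replace with your own server and NameNode.
    ftp = FTP("ftp.example.com")
    ftp.login()  # anonymous login; use ftp.login(user, passwd) otherwise

    # Imported lazily so the helpers above work without the package installed.
    from hdfs import InsecureClient
    client = InsecureClient("http://namenode:9870", user="hadoop")

    for filename in ftp.nlst():
        if filename.endswith(".zip"):
            for name, data in extract_csv_members(fetch_zip(ftp, filename)):
                client.write(f"/data/csv/{name}", data=data, overwrite=True)
    ftp.quit()


# main()  # uncomment to run against a real FTP server and HDFS cluster
```

Because everything is streamed through memory, nothing is written to local disk; for very large archives you would want to stream member-by-member instead of reading whole files into `bytes`.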




















  • I would probably take a look at nifi.apache.org first. Should do what you need.

    – Binary Nerd, Nov 20 '18 at 11:14






  • Why have you flagged both python and r? Which language do you want to use?

    – quant, Nov 20 '18 at 11:17











  • Python, thank you, but if there is a solution with R I can use it too.

    – Aladin Aloui, Nov 20 '18 at 11:21






  • I haven't used R much, but there seem to be some HDFS read and write libraries; I haven't been able to find any FTP libraries, though. As Binary Nerd suggested, you might want to have a look at NiFi; I would also suggest having a look at StreamSets. Both are pretty simple to use for this use case. Also, they can do the uncompressing to raw files, which will make your life easier.

    – shainnif, Nov 20 '18 at 12:49













  • Hadoop works fine with compressed data, depending on the format (zip files are readable). Even if you used NiFi, writing the same code in any language that connects to FTP, downloads a file, decompresses it, then uploads to HDFS would give the same result. The Hadoop API, however, is primarily Java-based, so if you're trying to use Python or R, Spark is probably the most consistent solution.

    – cricket_007, Nov 20 '18 at 14:02
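A hedged sketch of the Spark route mentioned in the last comment: Spark has no built-in zip codec, but `SparkContext.binaryFiles` pairs each zip already sitting on HDFS with its raw bytes, which a plain-Python helper can then unpack. The HDFS glob path and the SparkSession setup are placeholder assumptions:

```python
# Sketch: read zip archives from HDFS with PySpark by pairing each file
# with its bytes (binaryFiles) and unpacking the CSV members in Python.
import io
import zipfile


def unzip_csv_lines(zip_bytes: bytes):
    """Return the text lines of every .csv member in a zip archive."""
    lines = []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if name.lower().endswith(".csv"):
                lines.extend(zf.read(name).decode("utf-8").splitlines())
    return lines


def csv_lines_rdd(spark, path="hdfs:///data/raw/*.zip"):  # path is a placeholder
    # binaryFiles yields (filename, bytes) pairs; flatMap unpacks each zip.
    sc = spark.sparkContext
    return sc.binaryFiles(path).flatMap(lambda kv: unzip_csv_lines(kv[1]))


# Example (requires pyspark and a running cluster):
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.getOrCreate()
# csv_lines_rdd(spark).take(5)
```

Note that each whole zip is handled by a single task, so this parallelizes across archives rather than within one; that is usually fine for many small-to-medium zips.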


















edited Nov 20 '18 at 14:01 by James Westgate
asked Nov 20 '18 at 11:12 by Aladin Aloui
