Package Details: hadoop 3.3.1-1

Git Clone URL:
Package Base: hadoop
Description: Hadoop - MapReduce implementation and distributed filesystem
Upstream URL:
Licenses: Apache
Submitter: sjakub
Maintainer: severach (12eason)
Last Packager: severach
Votes: 80
Popularity: 0.000104
First Submitted: 2009-04-07 16:39
Last Updated: 2021-11-27 01:48

Dependencies (4)

Required by (5)

Sources (9)

Latest Comments


lllf commented on 2021-05-05 01:53

ERROR: Failure while downloading

URL no longer exists.

sicalxy commented on 2021-04-22 16:42

[hadoop-conf]: EnvironmentFile for *.service doesn't work

An EnvironmentFile should not `source . /etc/profile.d/...`. It is just a key-value text file (although it supports a few shell-like features), as the man page [systemd.exec(5)] notes:

The text file should contain new-line-separated variable assignments. Empty lines, lines without an "=" separator, or lines starting with ; or # will be ignored.
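To illustrate the distinction: an EnvironmentFile holds plain assignments only, no commands. A minimal sketch, assuming a hypothetical /etc/conf.d/hadoop file and JDK path (adjust both to your system):

```ini
# /etc/conf.d/hadoop  (hypothetical path) -- plain KEY=value lines only;
# no `source`, no command substitution, no exports.
JAVA_HOME=/usr/lib/jvm/default
HADOOP_CONF_DIR=/etc/hadoop
```

referenced from the unit file as:

```ini
[Service]
EnvironmentFile=/etc/conf.d/hadoop
```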

sicalxy commented on 2021-04-22 16:41

[hadoop-profile]: old option should be updated

- export HADOOP_SLAVES=/etc/hadoop/slaves
+ export HADOOP_WORKERS=/etc/hadoop/workers

siavoshkc commented on 2021-01-22 19:26

There seems to be a bug in PKGBUILD and consequently in /usr/bin/hadoop.

When a .sh file is placed in /etc/profile.d, /etc/profile itself should be sourced to put the scripts in /etc/profile.d into effect.

In the current hadoop 3.3.0-1, there is a loop that tries to source each .sh file in the profile.d directory. Because the scripts in profile.d depend on helpers defined in /etc/profile, this leads to errors such as 'append_path: command not found'.

Resolution: change the loop that starts at line 111 and ends at line 113 to: . /etc/profile
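A sketch of the suggested fix (hypothetical function name; paths per the comment above). Helpers such as append_path are defined in /etc/profile itself, so the profile.d snippets must be loaded through it rather than sourced one by one:

```shell
# Replace the per-file loop over /etc/profile.d/*.sh in /usr/bin/hadoop
# with a single call like this.
load_profile() {
    # /etc/profile defines its helper functions first, then sources
    # /etc/profile.d/*.sh itself, so the snippets see those helpers.
    [ -r /etc/profile ] && . /etc/profile
    return 0
}
```

In practice the one-liner `. /etc/profile` in place of the loop achieves the same thing.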

Musikolo commented on 2020-04-12 16:39

Hi @qsdrqs,

I don't know how to help you with your question about Yarn, but if you want to find the systemd services available, you can do as follows:

[musikolo@MyPc ~]$ pacman -Ql hadoop | grep 'service$'
hadoop /usr/lib/systemd/system/hadoop-datanode.service
hadoop /usr/lib/systemd/system/hadoop-jobtracker.service
hadoop /usr/lib/systemd/system/hadoop-namenode.service
hadoop /usr/lib/systemd/system/hadoop-secondarynamenode.service
hadoop /usr/lib/systemd/system/hadoop-tasktracker.service

I hope it helps.

qsdrqs commented on 2020-04-01 05:57

Hello! How can I start the yarn service through this package? The script in hadoop/sbin may not recognize my config in /etc, and I can't find any systemd service on my computer to start it.

Looking forward to your reply!

takaomag commented on 2019-12-04 08:33

When I installed this package with yay, I received the following message in the terminal:

yay -S --needed --noconfirm --noprogressbar hadoop

==> Removing existing $srcdir/ directory...
==> Extracting sources...
  -> Extracting hadoop-3.2.1.tar.gz with bsdtar
==> Sources are ready.
removing Untracked AUR files from cache...
:: Cleaning (1/1): /var/lib/x-aur-helper/.cache/yay/hadoop
Removing hadoop-3.2.1.tar.gz
Can not find package name : []

I did not modify the PKGBUILD. Does anyone know of a solution?

dxxvi commented on 2017-06-07 05:08

How do I start this hadoop? I try:

sudo systemctl start hadoop-datanode hadoop-jobtracker hadoop-namenode hadoop-secondarynamenode hadoop-tasktracker

then check their status:

systemctl status hadoop-datanode hadoop-jobtracker hadoop-namenode hadoop-secondarynamenode hadoop-tasktracker

All of them failed. The jobtracker has this line:

Error: JAVA_HOME is not set and could not be found.

Update:
- JAVA_HOME error / unable to start namenode and datanode: see the Hadoop ArchWiki page on formatting a new distributed filesystem and on editing core-site.xml and hdfs-site.xml.
- jobtracker and tasktracker cannot start: running the commands in hadoop-jobtracker.service and hadoop-tasktracker.service under the hadoop account shows the reasons (12eason also mentioned this).
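One common way to fix the JAVA_HOME error for these units is a systemd drop-in. A minimal sketch, assuming a hypothetical drop-in path and a JDK installed at /usr/lib/jvm/default (adjust to your system):

```ini
# /etc/systemd/system/hadoop-namenode.service.d/java.conf  (hypothetical drop-in)
[Service]
Environment=JAVA_HOME=/usr/lib/jvm/default
```

After adding the drop-in, run `systemctl daemon-reload` and restart the unit; repeat for the other hadoop-*.service units as needed.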

12eason commented on 2017-03-14 22:45

First, hdfs, mapred, container-executor, rcc and yarn all need to be linked into /usr/bin along with hadoop. Hdfs in particular now provides many of the functions previously handled by hadoop.

Secondly, the hadoop package ships shell scripts under sbin/ to start and stop instances, and the systemd units would be less prone to breakage if they used those. As it is, many of the commands the units invoke are deprecated.
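The linking step above can be sketched as follows (hypothetical function and default paths; check where your PKGBUILD actually installs the CLIs before running this as root):

```shell
link_hadoop_tools() {
    # $1: directory the package installed the CLIs into
    #     (assumption: /usr/lib/hadoop/bin -- verify against the PKGBUILD)
    # $2: destination for the symlinks (normally /usr/bin)
    src="${1:-/usr/lib/hadoop/bin}"
    dest="${2:-/usr/bin}"
    for tool in hdfs mapred container-executor rcc yarn; do
        ln -sf "$src/$tool" "$dest/$tool"
    done
}
```

Usage (as root): `link_hadoop_tools` — or, better, add equivalent `ln -s` lines to the PKGBUILD's package() function so pacman tracks the links.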

nmiculinic commented on 2017-03-11 17:47

There are mirror problems for hadoop:

==> Making package: hadoop 2.7.3-1 (Sat Mar 11 18:48:07 CET 2017)
==> Retrieving sources...
  -> Downloading hadoop-2.7.3.tar.gz...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0
Warning: Transient problem: HTTP error. Will retry in 3 seconds. 3 retries left.