What will a Hadoop job do if developers try to run it with an output directory that is already present?

1 Answer


By default, a Hadoop job will not run if the output directory already exists: the output format's output check (FileOutputFormat.checkOutputSpecs) fails with a FileAlreadyExistsException before any tasks start. This is a safety measure for the user — if a new job were accidentally pointed at an old output directory, it could overwrite results that took significant time and effort to produce. That said, this is not a hard limitation. Depending on the requirement, you can work around it, for example by deleting the output directory before submitting the job (with FileSystem.delete() in the driver code, or `hadoop fs -rm -r <dir>` from the shell) or by writing each run to a fresh, uniquely named directory.
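The check-then-delete workaround can be sketched in plain Python against the local filesystem (a real driver would use Hadoop's FileSystem API instead; the function name `prepare_output_dir` and the `overwrite` flag are illustrative, not part of any Hadoop API):

```python
import shutil
import tempfile
from pathlib import Path

def prepare_output_dir(path: Path, overwrite: bool = False) -> None:
    """Mimic Hadoop's pre-flight check: refuse to reuse an existing
    output directory unless the caller explicitly opts in."""
    if path.exists():
        if not overwrite:
            # Hadoop raises FileAlreadyExistsException at this point.
            raise FileExistsError(f"Output directory {path} already exists")
        # Explicit opt-in: remove the previous run's results first.
        shutil.rmtree(path)

# Usage: first run without overwrite fails, second run clears the way.
base = Path(tempfile.mkdtemp())
out = base / "job_output"
out.mkdir()
try:
    prepare_output_dir(out)               # refused: directory exists
except FileExistsError as e:
    print("refused:", e)
prepare_output_dir(out, overwrite=True)   # deleted; job can now create it
```

The key design point is that the deletion is an explicit, opt-in step in your own driver code, so Hadoop's default protection stays in place for everyone else.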
