I am writing an R package to support reproducible research. At this point, the workflow is mostly held together by bash scripts, and I can run an analysis by sending a single command like ./runscript.sh. I use bash for the following:
- tar
- rsync
- ssh
- rename
- sed
- qsub
- R --vanilla (running scripts that in turn call R functions)
It seems to me that it would be much more efficient (cleaner and easier) to execute the entire workflow from an R function or R script. I am partial to R since I am more familiar with it and mostly work within emacs ESS.
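To make the idea concrete, here is a hypothetical sketch of what a single R entry point replacing ./runscript.sh might look like. All of the names here (run_analysis, run_model.sh, postprocess.R, the host string) are illustrative assumptions, not part of my current workflow:

```r
## Hypothetical sketch: one R function driving the whole workflow.
## External tools are wrapped with system2(), which returns the exit status.
run_analysis <- function(src = "data", host = "user@cluster") {
  # sync input files to the cluster (rsync still does the heavy lifting)
  system2("rsync", c("-az", src, paste0(host, ":", src)))
  # submit the batch job over ssh
  system2("ssh", c(host, "qsub run_model.sh"))
  # post-processing stays in plain R
  source("postprocess.R")
}
```

The appeal is that the control flow, error handling, and logging would all live in one language, while the external tools are still invoked where they are genuinely needed.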
Would it be worthwhile to encapsulate all of these uses of bash within R, using the system() function and R's file-handling functions (file.copy(), file.rename(), etc.)?
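For the pieces that do have base-R equivalents, a minimal runnable sketch of what I have in mind (the file names and directory are invented for illustration):

```r
## Replacing shell tools with base R: rename/sed via gsub() + file.rename(),
## tar via utils::tar(). Runs entirely in a temporary directory.

# setup: a scratch directory with a couple of dummy data files
dir <- file.path(tempdir(), "workflow-demo")
dir.create(dir, showWarnings = FALSE)
old <- file.path(dir, c("run1.dat", "run2.dat"))
file.create(old)

# 'rename'/sed-style batch renaming: pattern substitution on the names,
# then an atomic rename
new <- gsub("\\.dat$", ".txt", old)
file.rename(old, new)

# tar: utils::tar() instead of shelling out to tar
archive <- file.path(tempdir(), "results.tar.gz")
owd <- setwd(dir)                       # tar() stores paths relative to cwd
tar(archive, files = ".", compression = "gzip")
setwd(owd)

# tools with no R equivalent (rsync, ssh, qsub) could still be driven via
# system()/system2(), e.g.:
# status <- system2("rsync", c("-az", dir, "user@host:backup/"))
```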
Are there other R packages that I have not yet found that would be helpful for doing this?
Following Al3xa's answer, I should note that the speed penalty of using, e.g., R's tar() and gsub() instead of the shell's tar and sed on 1000-2000 files would likely be small compared to the current rate-limiting steps in the workflow: computations by JAGS (~10-20 min) and FORTRAN (>4 hrs).