R implementations
R implementations of Project Euler exercises
This package uses the following dependencies:
If you decide to run the full integration pipeline, a Docker image containing all dependencies is available on Docker Hub as r-testing.
This package has a website presenting all the API configurations and usage, followed by some examples. To build it, just run:
R -e "pkgdown::build_site()"
To run all the tests, just run:
R -e "devtools::document();devtools::test()"
You need to run devtools::document() first because of the Rcpp implementations in src.
The site configuration lives in the _pkgdown.yml file, along with the docsearch keys used for the indexing behind the search bar.
The code follows the standards defined in the .lintr file.
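For reference, a `.lintr` file typically looks like the sketch below. These values are illustrative defaults, not the project's actual settings (and note that older lintr versions use `with_defaults()` instead of `linters_with_defaults()`):

```
linters: linters_with_defaults(
    line_length_linter(100),
    object_name_linter("snake_case")
  )
encoding: "UTF-8"
```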
The Continuous Integration (CI) used in this project is supported by the following services:
through a Docker environment, running the following pipeline commands:
R -e "devtools::document()"
R -e "devtools::check()"
R -e "covr::package_coverage()"
R -e "options(warn=-1);goodpractice::gp('projectEuler')"
R -e "pkgdown::build_site()"
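The separate `R -e` calls above can also be driven from a single R script. The sketch below (my own wrapper, not part of the package) stores each step as an unevaluated call so the pipeline can be inspected or dry-run before anything executes; only the `goodpractice::gp('projectEuler')` path comes from the README.

```r
# Each step as an unevaluated call; nothing runs until eval().
steps <- list(
  quote(devtools::document()),
  quote(devtools::check()),
  quote(covr::package_coverage()),
  quote({ options(warn = -1); goodpractice::gp("projectEuler") }),
  quote(pkgdown::build_site())
)

# Print each step, then evaluate it unless dry_run is requested.
run_pipeline <- function(steps, dry_run = FALSE) {
  for (s in steps) {
    message("step: ", paste(deparse(s), collapse = " "))
    if (!dry_run) eval(s)
  }
}

run_pipeline(steps, dry_run = TRUE)  # lists the steps without executing them
```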
You can see this in the DESCRIPTION file.
MVC is a kind of Separation of Concerns; in an R environment we could translate it as "package scoping", limiting what this package does and how it behaves through the connections it makes.
In this case, all of the R source files are in the Model scope, since they only handle data processing. If someone wishes to connect it to an application, such as Microsoft Excel, they must first link to it through a Controller, which binds the caller to the package. Last, but not least, the Viewer in this equation (Excel, in this example) must display results and forward all user requests to the other levels; in other words, it handles the I/O.
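The scoping idea can be sketched as follows. The function names are hypothetical illustrations, not part of the package: the Model layer is pure computation, while a Controller validates what a caller (such as an Excel bridge) sends before delegating.

```r
# Model: pure data processing, no I/O (Project Euler problem 1 style).
model_sum_multiples <- function(limit, of = c(3, 5)) {
  xs <- seq_len(limit - 1)
  sum(xs[xs %% of[1] == 0 | xs %% of[2] == 0])
}

# Controller: coerces and validates the caller's raw input before
# delegating to the Model; the Viewer (e.g. Excel) handles display.
controller_sum_multiples <- function(raw_limit) {
  limit <- suppressWarnings(as.integer(raw_limit))
  if (is.na(limit) || limit < 1) stop("limit must be a positive integer")
  model_sum_multiples(limit)
}

controller_sum_multiples("1000")  # returns 233168
```

Keeping the Model free of I/O is what makes it reusable from any Viewer, whether that is Excel, a Shiny app, or a plain script.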
Also, using this approach, you can benefit your application through the use of tools like:
Make the most of each available tool: some of them do not work 100% with R code, but you can let each one perform well where it can.
Docker images can help "future proof" your package or application by creating a sandbox that is OS-agnostic, self-describing, and highly performant.
Besides all of this, you can make changes in a controlled development environment before pushing to production, testing first and fine-tuning where needed.
More about it here.
Plotting should be done at the Viewer level whenever possible; I would advise limiting it to the Controller at most. That is because packages like ggplot2 can affect performance. This is not to say they are bad and you should never use them; but remember that you are interacting with R through another application, one that may already provide its own belt of plotting utilities, and those tools are there to assist you.
Also, testing a plot is harder than testing plain numbers, data frames, or even a database transaction.
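To illustrate, here is a hedged testthat sketch (the `sum_below` function is hypothetical, not from the package): a numeric result can be asserted exactly, whereas a ggplot2 object can mostly only be checked structurally, e.g. `expect_s3_class(p, "ggplot")`, never "does it look right".

```r
library(testthat)

# Hypothetical Model function returning plain data.
sum_below <- function(n) sum(seq_len(n - 1))

test_that("numeric results are easy to assert on exactly", {
  expect_equal(sum_below(10), 45)
  expect_equal(sum_below(2), 1)
})
```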
MIT license presented in the LICENSE file.