Creating a Dashboard Framework with AWS (Part 1)

R-Shiny is an excellent framework for creating interactive dashboards, aimed at data scientists without extensive web development experience. Similar technologies in other languages include the Python frameworks Flask, Dash and Streamlit. Bringing all of these different dashboards under one roof, including unified authentication and user management, can be a challenging task. In this blog series we will show how we've implemented such a framework with AWS.

Use Case and Requirements

The dashboard framework was created for a research department at a major financial institution. Analysts and data scientists had already created dashboards covering different topics, based on numerous technologies including R-Shiny and Python-Flask. However, a secure and unified user authentication mechanism was crucial to put the dashboards into production and restrict access to selected users only. Additionally, most analysts and data scientists do not have much DevOps experience (e.g. with Docker containers) and thus needed an easy and automated way to adapt their existing dashboards. Last but not least, the team's head count was limited on the system operations side, so a simple solution with low maintenance was needed. The entire solution had to be implemented on Amazon Web Services (AWS) as the cloud provider of choice.

Based on this situation we were asked to create a dashboard framework architecture with these requirements in mind:

  1. Secure, end-to-end encrypted (SSL, TLS) access to dashboards.
  2. Secure authentication through e-mail and Single Sign-On (SSO).
  3. Horizontal, fail-safe scalability of dashboards according to usage.
  4. Easy adaptability by analysts through automation and continuous integration (CI/CD).
  5. Easy maintenance and extensibility for system operators.

System Architecture

All considerations above led to a simple yet effective system architecture based on selected managed AWS services including

  1. Application Load Balancer (ALB) to handle secure, end-to-end (SSL) encrypted access to the dashboards based on different host names (host-based routing).
  2. AWS Cognito for user authentication based on E-mail and SSO through Ping Federate.
  3. AWS Fargate for horizontal scalability, fail-safe operations and easy maintenance.
  4. AWS CodePipeline and CodeBuild for automated builds of the dashboard Docker containers.
  5. Extensive use of low-maintenance managed services (Fargate, Cognito, ALB) and the AWS Cloud Development Kit (CDK) to define infrastructure as code, tracked in Git and deployed via CodePipeline.

The figure below illustrates the resulting architecture in more detail:

1. Application Load Balancer

A central piece of the system architecture is the Application Load Balancer (ALB), which routes traffic securely to each dashboard. We configured the ALB with host-based routing, so that requests to e.g. https://dashboard1.domain.com or https://dashboard2.domain.com are routed to the respective dashboards. The ALB handles SSL offloading so that all communication between clients and the load balancer is end-to-end SSL/TLS encrypted. Additionally, we use the ALB feature to authenticate users through an OIDC-compliant identity provider, such as Amazon Cognito. Thus, all users without an authentication token are redirected to a login page provided by a Cognito Hosted UI. After successful authentication users are allowed to access their dashboard of choice.
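
To make this concrete, here is a minimal CDK (TypeScript) sketch of such a listener rule, not the exact production code: the load balancer, certificate, Cognito resources and target group are assumed to exist elsewhere in the stack (marked with declare const), and the host name is purely illustrative.

```typescript
import * as elbv2 from 'aws-cdk-lib/aws-elasticloadbalancingv2';
import { AuthenticateCognitoAction } from 'aws-cdk-lib/aws-elasticloadbalancingv2-actions';
import * as cognito from 'aws-cdk-lib/aws-cognito';

// Placeholders for resources defined elsewhere in the stack
declare const alb: elbv2.ApplicationLoadBalancer;
declare const certificate: elbv2.IListenerCertificate;
declare const userPool: cognito.IUserPool;
declare const userPoolClient: cognito.IUserPoolClient;
declare const userPoolDomain: cognito.IUserPoolDomain;
declare const dashboard1TargetGroup: elbv2.IApplicationTargetGroup;

// HTTPS listener: SSL is terminated ("offloaded") at the load balancer
const listener = alb.addListener('HttpsListener', {
  port: 443,
  certificates: [certificate],
});

// Host-based rule: requests to dashboard1.domain.com are only forwarded
// to the dashboard's target group after Cognito authentication succeeds.
listener.addAction('Dashboard1', {
  priority: 10,
  conditions: [elbv2.ListenerCondition.hostHeaders(['dashboard1.domain.com'])],
  action: new AuthenticateCognitoAction({
    userPool,
    userPoolClient,
    userPoolDomain,
    next: elbv2.ListenerAction.forward([dashboard1TargetGroup]),
  }),
});

// Catch-all for requests that match no host rule
listener.addAction('Default', {
  action: elbv2.ListenerAction.fixedResponse(404, { messageBody: 'Not found' }),
});
```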

2. AWS Cognito for User Authentication

We used Amazon Cognito, a managed identity provider from AWS that supports all important authentication mechanisms such as e-mail/password (plus MFA) and federation providers like Google, Facebook or Apple. Most importantly, Cognito also supports SAML providers like Ping Federate for SSO within large corporations. The login form is also hosted by Cognito and presented to users who have not yet logged into any dashboard.
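
Below is a hedged sketch of how such a user pool, hosted UI domain and ALB app client could be defined with CDK; the domain prefix, callback URL and resource names are assumptions, and a SAML identity provider such as Ping Federate would additionally be attached to the pool for corporate SSO.

```typescript
import * as cdk from 'aws-cdk-lib';
import * as cognito from 'aws-cdk-lib/aws-cognito';

// Placeholder for the stack the resources are created in
declare const stack: cdk.Stack;

// User pool with e-mail sign-in and optional MFA
const userPool = new cognito.UserPool(stack, 'DashboardUsers', {
  selfSignUpEnabled: false,
  signInAliases: { email: true },
  mfa: cognito.Mfa.OPTIONAL,
});

// Hosted UI domain serving the login form for unauthenticated users
const userPoolDomain = userPool.addDomain('HostedUi', {
  cognitoDomain: { domainPrefix: 'my-dashboards' }, // assumed prefix
});

// App client used by the ALB's OIDC authentication action
const userPoolClient = userPool.addClient('AlbClient', {
  generateSecret: true,
  oAuth: {
    flows: { authorizationCodeGrant: true },
    scopes: [cognito.OAuthScope.OPENID, cognito.OAuthScope.EMAIL],
    // ALB's fixed OIDC callback path on the dashboard host
    callbackUrls: ['https://dashboard1.domain.com/oauth2/idpresponse'],
  },
});
```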

3. AWS Fargate

All dashboards run in Docker containers hosted as Fargate tasks within a common cluster. This makes it possible to create dashboards independently of each other, with different versions of R (or Python), different packages and even different operating systems. The pricing of Fargate tasks is comparable to EC2, depending on the CPU/memory configuration, but comes with the advantage of being completely managed. This also makes auto-scaling a breeze: new tasks are added depending on the current workload.
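
As an illustration, a single dashboard service could be defined in CDK roughly as follows; the CPU/memory sizing, ECR repository, container name and port (3838, the Shiny Server default) are assumptions rather than the exact setup.

```typescript
import * as cdk from 'aws-cdk-lib';
import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as ecs from 'aws-cdk-lib/aws-ecs';
import * as ecr from 'aws-cdk-lib/aws-ecr';

// Placeholders for resources defined elsewhere in the stack
declare const stack: cdk.Stack;
declare const vpc: ec2.IVpc;
declare const dashboardRepo: ecr.IRepository;

// Shared cluster for all dashboard services
const cluster = new ecs.Cluster(stack, 'DashboardCluster', { vpc });

// One task definition per dashboard: each gets its own image, and therefore
// its own R/Python version, packages and base OS.
const taskDef = new ecs.FargateTaskDefinition(stack, 'Dashboard1Task', {
  cpu: 512,
  memoryLimitMiB: 1024,
});
taskDef.addContainer('shiny', {
  image: ecs.ContainerImage.fromEcrRepository(dashboardRepo, 'latest'),
  portMappings: [{ containerPort: 3838 }], // default Shiny Server port
  logging: ecs.LogDrivers.awsLogs({ streamPrefix: 'dashboard1' }),
});

const service = new ecs.FargateService(stack, 'Dashboard1Service', {
  cluster,
  taskDefinition: taskDef,
  desiredCount: 1,
});

// Scale out on CPU load instead of sizing instances up front
service
  .autoScaleTaskCount({ minCapacity: 1, maxCapacity: 4 })
  .scaleOnCpuUtilization('CpuScaling', { targetUtilizationPercent: 60 });
```

The service would then be registered as a target of the ALB target group used in the host-based routing rule above.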

4. Code Pipeline

Time to market is essential in many industries to get changes and features to the customer as fast as possible. Additionally, many dashboard developers do not want to be occupied with DevOps tasks like Docker containers and bash scripting. By using CodePipeline we made sure that dashboard developers only need to push changes to the repository; the pipeline builds the Docker container, pushes it to the Elastic Container Registry (ECR) and subsequently deploys the new container to the cluster using CodeDeploy. The deployment ensures that users have a seamless experience by redirecting new sessions to new instances and dropping old instances once no open sessions are left.
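
A sketch of such a pipeline in CDK is shown below. Note that it uses the standard ECS rolling-deploy action rather than the blue/green CodeDeploy variant described above, and the repository names, container name and build commands are placeholders.

```typescript
import * as cdk from 'aws-cdk-lib';
import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as actions from 'aws-cdk-lib/aws-codepipeline-actions';
import * as codecommit from 'aws-cdk-lib/aws-codecommit';
import * as ecs from 'aws-cdk-lib/aws-ecs';
import * as ecr from 'aws-cdk-lib/aws-ecr';

// Placeholders for resources defined elsewhere in the stack
declare const stack: cdk.Stack;
declare const repo: codecommit.IRepository; // dashboard source repository
declare const imageRepo: ecr.IRepository;   // ECR repository for the image
declare const service: ecs.FargateService;  // dashboard Fargate service

const sourceOutput = new codepipeline.Artifact();
const buildOutput = new codepipeline.Artifact();

// Docker-enabled build that pushes the image to ECR and emits the
// imagedefinitions.json file required by the ECS deploy action.
const buildProject = new codebuild.PipelineProject(stack, 'DashboardBuild', {
  environment: { privileged: true }, // required for docker build
  environmentVariables: { REPO_URI: { value: imageRepo.repositoryUri } },
  buildSpec: codebuild.BuildSpec.fromObject({
    version: '0.2',
    phases: {
      build: {
        commands: [
          'aws ecr get-login-password | docker login --username AWS --password-stdin ${REPO_URI%%/*}',
          'docker build -t $REPO_URI:latest .',
          'docker push $REPO_URI:latest',
          // container name must match the task definition ("shiny")
          "printf '[{\"name\":\"shiny\",\"imageUri\":\"%s\"}]' $REPO_URI:latest > imagedefinitions.json",
        ],
      },
    },
    artifacts: { files: ['imagedefinitions.json'] },
  }),
});
imageRepo.grantPullPush(buildProject);

new codepipeline.Pipeline(stack, 'DashboardPipeline', {
  stages: [
    {
      stageName: 'Source',
      actions: [new actions.CodeCommitSourceAction({
        actionName: 'Source', repository: repo, output: sourceOutput,
      })],
    },
    {
      stageName: 'Build',
      actions: [new actions.CodeBuildAction({
        actionName: 'Build', project: buildProject, input: sourceOutput, outputs: [buildOutput],
      })],
    },
    {
      stageName: 'Deploy',
      actions: [new actions.EcsDeployAction({
        actionName: 'Deploy', service, input: buildOutput,
      })],
    },
  ],
});
```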

5. CDK and CI/CD

The AWS Cloud Development Kit (CDK) was a very important tool to quickly set up the entire stack, including infrastructure components, build pipelines and even domain entries. TypeScript was our language of choice since (1) it has the best community support (followed by Python) and (2) CDK itself is written in TypeScript, which makes debugging much easier. Since CDK code is synthesized to plain AWS CloudFormation templates through the command cdk synth, developers get immediate feedback if something went wrong, which shortens the feedback cycle. Through cdk deploy the templates are uploaded and deployed as CloudFormation stacks. Thanks to the infrastructure-as-code principle it is very easy to track changes in Git version control and deploy the stacks to multiple accounts for development/staging/production.
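
For illustration, a minimal CDK app entry point could look like the following; the stack class, account IDs and region are placeholders, not the actual setup.

```typescript
// bin/app.ts
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';

// A hypothetical stack bundling the ALB, Cognito, Fargate and pipeline
// constructs sketched in the previous sections.
class DashboardFrameworkStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);
    // ... ALB, Cognito user pool, Fargate cluster, build pipelines ...
  }
}

const app = new cdk.App();

// The same code is deployed to separate accounts for dev/staging/prod
new DashboardFrameworkStack(app, 'Dashboards-Dev', {
  env: { account: '111111111111', region: 'eu-central-1' },
});
new DashboardFrameworkStack(app, 'Dashboards-Prod', {
  env: { account: '222222222222', region: 'eu-central-1' },
});

// `cdk synth` renders these stacks to CloudFormation templates;
// `cdk deploy Dashboards-Dev` creates or updates the corresponding stack.
```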

Conclusion

In this post we gave an overview of the system architecture for deploying a simple yet powerful dashboard framework on Amazon Web Services. The presented framework fulfilled all security requirements and requires little maintenance effort thanks to the many managed AWS services involved. In the next post we will show how such a framework can be built from scratch using CDK, including dashboard templates for R-Shiny, Flask, Dash and Streamlit.

Stay tuned! ✌️

Get in Touch

Interested in creating your own dashboard framework or other data science cloud stacks? Just get in touch:

E-Mail: [email protected] Contact Form: Link
