In this article, we will answer a few popular questions about the Scala* Prerequisite Assessment - the first major challenge on your path to becoming a Quantexa Certified Data Engineer or Scoring Engineer.
You can learn more about these two certification programs in the article Introduction to the Quantexa Academy.
1. Why is this Assessment necessary?
The Quantexa Data Engineer / Scoring Engineer Academies are challenging.
These programs therefore require a certain level of Scala & Spark proficiency.
This assessment is designed to help us decide whether your Scala & Spark skills are strong enough for you to complete these Academies successfully.
This assessment is also a great opportunity to receive targeted feedback on your coding approach and how well it follows best practices.
You need to pass this Assessment in order to be admitted into the Quantexa Data Engineer / Scoring Engineer Academy.
2. How does the Assessment work?
Once onboarded onto our Learning Management System, you will be able to download a zip file containing three exercises and a Scala & Spark project with the provided code.
- You will be asked to submit your work - as a zip file - via the LearnUpon platform.
- Please submit your code for all three exercises together.
- Once your answers have been submitted, your code will be reviewed and you will receive feedback on whether you have passed the Assessment.
- You may be asked to revisit certain questions and provide an improved answer based on the Instructor's feedback.
3. What are the Assessment criteria?
We will review your code to check that it follows best practices and is written in a purely functional style.
💡Before submitting your assessment, please ask yourself these three questions:
- Does my code follow best practice Scala Coding Guidelines (see below)?
- Have I fully answered the questions asked of me?
- Do my final datasets match up with those provided?
4. Scala Coding Guidelines
💡These are the questions you need to keep asking yourself when you write your code.
- Am I using case classes rather than large nested tuples?
- Is my code easily readable, self-documenting with clearly named variables?
- Does my code run without error?
- Is my code flexible (e.g. no hard coded paths)?
- Is my code written in a functional style and have I only used immutable objects (e.g. vals not vars)?
- Would my code still work for large data volumes?
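The guidelines above can be illustrated with a minimal sketch. The `Transaction` model, field names, and threshold below are invented for this example and have nothing to do with the actual assessment exercises; the point is the style, not the domain:

```scala
// Hypothetical example illustrating the coding guidelines above.
// A case class is self-documenting, unlike a bare (String, String, BigDecimal) tuple.
final case class Transaction(id: String, accountId: String, amount: BigDecimal)

object TransactionSummary {
  // Functional style: immutable inputs, no vars, no side effects,
  // clearly named parameters, and no hard-coded values (the threshold
  // is passed in rather than baked into the function body).
  def totalLargeAmounts(transactions: Seq[Transaction], threshold: BigDecimal): BigDecimal =
    transactions
      .filter(_.amount > threshold)   // keep only transactions above the threshold
      .map(_.amount)                  // project to the amounts
      .sum                            // combine immutably
}
```

Used like this, the function sums only the amounts above the supplied threshold:

```scala
val txs = Seq(
  Transaction("t1", "a1", BigDecimal(150)),
  Transaction("t2", "a1", BigDecimal(50))
)
TransactionSummary.totalLargeAmounts(txs, BigDecimal(100))  // BigDecimal(150)
```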
5. Can I complete the Assessment in collaboration with my colleagues?
🚩 No! This is an individual Assessment.
If we identify that several submissions look identical, we will conduct a viva (oral assessment).
We will ask you to talk us through your solutions and to demonstrate your understanding of the code you have produced.
6. Can I complete the Assessment using ChatGPT or other AI tools?
🚩 No! Please, don't do it.
The aim of the Assessment is to verify whether you have the necessary skills to independently handle Scala coding challenges.
Therefore, we discourage you from using ChatGPT or other AI tools to analyze and provide solutions to the challenges presented in the three exercises.
The same applies to other tasks you will be expected to complete during your learning experience.
7. How long does it take to complete the Assessment?
The Scala & Spark Prerequisite Assessment can take up to several weeks to complete, depending on the candidate's previous exposure to object-oriented, functional, and Spark programming.
You should expect to receive feedback within 48 hours of the submission.
8. What if my Scala or Spark is basic or if I don't know it?
If you are new to Scala or Spark and functional programming, we recommend the following materials.
While we believe these are solid courses, we cannot guarantee that completing any of these will result in passing the Assessment.
- Highly recommended course by the designer of the language, Martin Odersky, at École Polytechnique Fédérale de Lausanne (EPFL). Focuses on the functional paradigm.
- Focuses on the functions in the Scala Library, presented in multiple languages.
- Useful quick reference of Scala syntax, with examples of constructs provided.
- Part of the Functional Programming in Scala specialization recommended by EPFL.
- Covers Spark Foundations and Architecture, processing Data Frames, Data Sets and Spark SQL.
- Introduces relevant Scala, then dives into the Spark API; covers RDDs, Data Frames, Data Sets, ML and GraphX.
*Scala is a trade name of Lightbend Inc.
Did you know that you can log in (or sign up) to the Community to unlock further resources there and on our Documentation site?