Check out our new courses

At Luxoft Training we're never standing still, whether we're updating our current courses or developing new ones. So check out the most recent additions to our portfolio.

We have expanded our training offer in the areas of software testing, data science, programming, and software architecture, and added a new business intelligence category with a Tableau course designed for beginners.

Test-Driven Development (TDD) in Java

Explore the basics of test-driven development: first by understanding the key principles of the approach, then by practicing them while developing a complex multi-layer application.

You will learn about the unit test frameworks used for test-driven development in modern programming languages. We'll also cover some unit testing theory with Java code samples, useful for beginner developers regardless of the methodology used on the project.
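As a taste of the red-green cycle the course builds on, here is a minimal sketch in plain Java. The `StringCalculator` class and its `add` method are hypothetical examples, and plain runtime checks stand in for a unit test framework such as JUnit to keep the snippet self-contained:

```java
// Minimal TDD-style sketch: the checks in main() are written first, then just
// enough production code to make them pass. A real project would use a unit
// test framework (e.g. JUnit); plain assertions keep this example self-contained.

// Production code, written only after the tests below demanded it.
class StringCalculator {
    // Sums a comma-separated list of integers; empty input yields 0.
    static int add(String numbers) {
        if (numbers.isEmpty()) return 0;
        int sum = 0;
        for (String part : numbers.split(",")) {
            sum += Integer.parseInt(part.trim());
        }
        return sum;
    }
}

public class TddSketch {
    // "Test first": these checks existed before StringCalculator did.
    public static void main(String[] args) {
        check(StringCalculator.add(""), 0);
        check(StringCalculator.add("5"), 5);
        check(StringCalculator.add("1, 2, 3"), 6);
        System.out.println("all tests green");
    }

    static void check(int actual, int expected) {
        if (actual != expected)
            throw new AssertionError("expected " + expected + " but got " + actual);
    }
}
```

Each new check is added while it still fails, and only then is `add` extended to make it pass, which is the core discipline the course practices at larger scale.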

REST API Test Automation

Intended for junior software testers and automation engineers as a quick introduction to API testing. UI test automation lets you fully cover your application with black-box tests, but it entails risks and costs: interface instability, the difficulty of developing such tests, and their long execution times. Testing at the API level avoids many of these drawbacks.
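To illustrate the contrast with UI testing, here is a minimal API-level test sketch in plain Java. It uses the JDK's built-in `com.sun.net.httpserver.HttpServer` as a stand-in for the service under test and `java.net.http.HttpClient` as the test client; the `/status` endpoint and its JSON payload are invented for the example, and a real course would likely use a dedicated library instead:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ApiTestSketch {
    public static void main(String[] args) throws Exception {
        // Stand-in for the REST service under test: GET /status returns JSON.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/status", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();

        // The "test": call the endpoint and assert on status code and payload.
        // No UI is involved, so it runs in milliseconds and is immune to layout changes.
        int port = server.getAddress().getPort();
        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create("http://localhost:" + port + "/status"))
                        .GET().build(),
                HttpResponse.BodyHandlers.ofString());
        server.stop(0);

        if (response.statusCode() != 200 || !response.body().contains("\"ok\""))
            throw new AssertionError("API test failed: " + response.statusCode());
        System.out.println("API test passed: 200 " + response.body());
    }
}
```

Because the assertion targets the HTTP contract rather than rendered pages, the test stays stable even when the interface changes, which is exactly the trade-off the course explores.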

Tableau Desktop Basics

During our training you'll get acquainted with Tableau as a platform, explore various ways of working with the system, and see the many data sources that can be connected for analysis and visualization. We will look at examples of analytical tasks from real business scenarios and solve them visually, so by the end of the course you will be able to publish your first Tableau report.

You will gain a complete understanding of how Tableau can support business processes and decisions in an organization, simplifying managers' access to data and thereby enabling data-driven decision-making.

Enterprise Integration

Our training will walk you through all the main options for integrating both independent systems and applications, as well as individual subsystems within a single system. Such integration can often be performed almost invisibly to the systems themselves.

For example, a BPM system can orchestrate calls to the systems, data can be exported from one and imported into another, or the steps of an ETL process can be described in the appropriate tool.

Hadoop Fundamentals

Apache Hadoop is an open-source framework for storing and processing large datasets efficiently. It clusters multiple computers so that huge datasets can be analyzed in parallel, far faster than on a single machine.

We'll look at HDFS, the de facto standard for large-scale, long-term, robust data storage; the MapReduce framework for automated distributed code execution; and companion projects from the Hadoop ecosystem.
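The MapReduce model itself can be illustrated without a cluster. Below is a plain-Java sketch of the classic word-count job; the input strings are invented, and in a real Hadoop job the map, shuffle, and reduce phases would be handled by Hadoop's `Mapper` and `Reducer` classes running distributed across the cluster:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class WordCountSketch {
    public static void main(String[] args) {
        // Input "splits" that Hadoop would distribute across worker nodes.
        List<String> splits = List.of("hadoop stores data", "hadoop processes data");

        // Map phase: each split is turned into (word, 1) pairs independently,
        // which is what lets Hadoop run mappers in parallel on separate nodes.
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String split : splits)
            for (String word : split.split("\\s+"))
                pairs.add(Map.entry(word, 1));

        // Shuffle phase: group values by key, as the framework does between phases.
        Map<String, List<Integer>> grouped = new HashMap<>();
        for (Map.Entry<String, Integer> p : pairs)
            grouped.computeIfAbsent(p.getKey(), k -> new ArrayList<>()).add(p.getValue());

        // Reduce phase: sum the counts for each word.
        Map<String, Integer> counts = new HashMap<>();
        grouped.forEach((word, ones) ->
                counts.put(word, ones.stream().mapToInt(Integer::intValue).sum()));

        System.out.println("hadoop=" + counts.get("hadoop") + " data=" + counts.get("data"));
    }
}
```

Because each phase only depends on its own key-value pairs, the same logic scales from this toy example to petabytes spread over thousands of machines, which is the central idea the course develops.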

Check out our training schedule to see all the courses we offer.

