Software Development and Testing at Jedox
Jedox serves as a platform for developing business applications with multiple front ends, providing users with significant flexibility to adapt to their business requirements.
Alongside the rapid pace of development, the Quality Assurance team at Jedox ensures that the software meets and exceeds expectations in terms of functionality, stability, security, and performance. The testing process at Jedox employs a variety of approaches, ranging from unit tests through API and integration tests to both automated and manual end-to-end tests. Where applicable, tests are conducted on cloud infrastructure similar to that used for customer deployments. Additionally, Jedox uses static code analysis to ensure that changes to the program code meet predefined quality standards.
We utilize both existing and custom-developed software tools for the automated testing process. The test case repository for both manual and automated tests is continuously expanded based on real-world experiences, practical test designs, and functionality specifications. Testing is performed not only for individual changes to the software code but also for packaged releases. Additionally, the change workflow requires acceptance from Product Owners after the development teams have completed their testing.
The Jedox In-Memory DB, along with the Jedox Supervision Server, undergoes daily testing that involves executing hundreds of scripted tests against its HTTP API. This testing is conducted using Apache JMeter, supported by a custom-designed framework that manages the execution of tests. The tests cover functionality, performance, security, and regression, and their results are reported to a centralized test management tool, which enhances debugging, tracking, and overall test visibility.
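The pattern of scripted HTTP API testing described above can be illustrated with a minimal, self-contained sketch. The snippet below is not JMeter and does not reproduce the Jedox In-Memory DB API; the endpoint path (`/server/info`), the stub server, and the result-record format are invented for illustration. It shows the general shape of one scripted functional check that produces a pass/fail record suitable for reporting to a test management tool.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    """Stub standing in for an HTTP API under test (hypothetical;
    not the real Jedox In-Memory DB API)."""
    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def run_scripted_check(base_url):
    """One scripted functional check: call an endpoint and return a
    pass/fail record, as a test-execution framework might collect."""
    with urllib.request.urlopen(base_url + "/server/info") as resp:
        payload = json.loads(resp.read())
        passed = resp.status == 200 and payload.get("status") == "ok"
    return {"test": "server_info", "passed": passed}

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    result = run_scripted_check(f"http://127.0.0.1:{server.server_address[1]}")
    server.shutdown()
    print(result)
```

In a real setup, hundreds of such checks would run against a deployed server rather than an in-process stub, with the collected records forwarded to the centralized test management tool.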
The Jedox Integration Server is tested using a custom-designed framework. During the testing process, over 100 pre-defined Integrator Projects, sourced from both laboratory designs and real-world scenarios, are executed. The job execution process, along with the results and performance metrics, is carefully monitored. Outcomes are reported to the centralized test management tool mentioned earlier.
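A harness that executes a set of pre-defined jobs while capturing outcomes and performance metrics, as described above, might look like the following sketch. The job names, the use of plain callables in place of real Integrator Projects, and the record format are assumptions made for this example.

```python
import time

def run_job_suite(jobs):
    """Execute a mapping of job name -> zero-argument callable,
    recording outcome, duration, and any error for each job.
    The record format is invented for this sketch."""
    records = []
    for name, job in jobs.items():
        start = time.perf_counter()
        try:
            job()
            outcome, error = "passed", None
        except Exception as exc:
            outcome, error = "failed", str(exc)
        records.append({
            "job": name,
            "outcome": outcome,
            "duration_s": round(time.perf_counter() - start, 3),
            "error": error,
        })
    return records

if __name__ == "__main__":
    suite = {
        "load_customers": lambda: None,   # stands in for a real project run
        "broken_mapping": lambda: 1 / 0,  # deliberately failing job
    }
    for record in run_job_suite(suite):
        print(record)
```

The design choice of catching exceptions per job keeps one failing project from aborting the rest of the suite, so a full set of results can still be reported at the end.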
Jedox Web, including Web Spreadsheets and Canvas reports, is tested using a custom-developed framework that primarily utilizes the WebDriver API. This framework simulates real-world usage of a web browser using Jedox Web. Each day, over 10,000 test cases are executed across various environments. These tests range from atomic functionality assessments (such as creating and using form elements, generating PDF documents, and executing tasks) to comprehensive evaluations of complete business applications.
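One common way to structure WebDriver-based end-to-end tests like those described above is the page-object pattern. The sketch below is not the Jedox framework: the `ReportPage` abstraction, the locator, and the URL are invented, and a recording fake stands in for a real `selenium.webdriver` instance so the example runs without a browser.

```python
class ReportPage:
    """Page object wrapping one screen of a web app. In a real test,
    `driver` would be a Selenium WebDriver; the locator is hypothetical."""
    EXPORT_BUTTON = ("css selector", "#export-pdf")

    def __init__(self, driver):
        self.driver = driver

    def open(self, url):
        self.driver.get(url)
        return self

    def export_pdf(self):
        self.driver.find_element(*self.EXPORT_BUTTON).click()

class FakeElement:
    def __init__(self, log):
        self.log = log
    def click(self):
        self.log.append("click")

class FakeDriver:
    """Records calls instead of driving a browser, so the page object
    can be exercised without Selenium installed."""
    def __init__(self):
        self.log = []
    def get(self, url):
        self.log.append(("get", url))
    def find_element(self, by, value):
        self.log.append(("find", by, value))
        return FakeElement(self.log)

if __name__ == "__main__":
    driver = FakeDriver()
    ReportPage(driver).open("https://example.test/report").export_pdf()
    print(driver.log)
```

Keeping locators and interactions inside page objects means that when the UI changes, only the page class needs updating, not the thousands of test cases built on top of it.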
For specific components within Jedox Web, such as Dynatables and Canvas Charts, we utilize the "Playwright" browser automation engine to automate testing of functionalities. Test results are then reported to the centralized test management tool mentioned earlier.
The Jedox Add-ins for Office 365 undergo both automated and manual testing to ensure quality. They are automatically tested within a framework that utilizes the "Playwright" browser automation engine. Additionally, any changes made to the Office Add-ins are manually tested against predefined acceptance criteria. Before each release of the Office Add-ins, we execute pre-specified test plans covering groups of functionalities.
In addition to automated and regular test executions, specific manual tests are conducted for out-of-cycle releases, such as patch releases. These tests are based on regression analysis and predefined test plans. Furthermore, a formal internal feedback loop gathers reports on functionality and software defects from Jedox's own internal use of its products, a practice commonly referred to as “dogfooding.”
Both the software development and quality assurance processes at Jedox are regularly audited by independent parties as part of various industry-standard certifications.
Updated September 26, 2025