Software Development and Testing at Jedox
As a framework for developing business applications with multiple front ends, Jedox gives users a great degree of freedom and flexibility in how they work with the software. Combined with the rapid pace of development, this means that Quality Assurance at Jedox must cope with constant changes to the software while ensuring that it meets the desired standards of functionality, stability, and performance.
The testing process at Jedox therefore uses multiple approaches, ranging from unit tests through API and integration tests to end-to-end tests, both automated and manual, executed on various platforms (Windows, Linux, 32-bit, 64-bit, etc.). The automated testing process relies on both existing and self-developed tools. The test-case base is continually expanded, based on real-world experience, pragmatic test design, and functionality specifications.
The Jedox In-Memory DB (together with the Jedox Supervision Server) is tested daily by running hundreds of scripted tests against its HTTP API. Apache JMeter is used for this purpose, embedded in a self-designed framework that manages test execution. Functionality, performance, and regression tests are executed, and results are reported in a common, self-developed Test Monitor, which is also used by the other automated test frameworks.
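For illustration, here is a minimal sketch of what one such scripted HTTP API check might look like. The endpoint path, port, and response handling are assumptions made for this example; they do not reflect the actual Jedox In-Memory DB API or the internal JMeter-based framework.

```python
# Minimal sketch of a scripted HTTP API check with a latency budget,
# in the spirit of the daily functionality/performance/regression tests
# described above. Endpoint paths and the port are illustrative only.
import time
import requests

BASE_URL = "http://localhost:7777"  # hypothetical In-Memory DB address

def run_check(path, params, expect_status=200, max_seconds=2.0):
    """Issue one API request and assert on status code and latency."""
    start = time.monotonic()
    response = requests.get(f"{BASE_URL}{path}", params=params, timeout=10)
    elapsed = time.monotonic() - start
    assert response.status_code == expect_status, (
        f"{path}: expected HTTP {expect_status}, got {response.status_code}")
    assert elapsed <= max_seconds, (
        f"{path}: took {elapsed:.2f}s, budget is {max_seconds:.2f}s")
    return response.text, elapsed

if __name__ == "__main__":
    # Functional check: the server answers an info request (hypothetical path).
    body, latency = run_check("/server/info", params={})
    print(f"/server/info answered in {latency:.3f}s: {body[:80]!r}")
```

A real framework would run hundreds of such checks per suite and push the pass/fail results and timings to a central reporting component, analogous to the Test Monitor mentioned above.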
The Jedox Integration Server is tested via a self-designed framework. During test execution, roughly 100 predefined Integrator projects (both lab-designed and derived from real-world use) are executed, and the job execution process, job results, and performance are monitored. Results are reported in the aforementioned Test Monitor.
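The following sketch shows how a framework might start an Integrator job and monitor its execution and runtime. The /start and /status endpoints, the JSON field names, and the project/job names are assumptions for illustration; they are not the actual Jedox Integration Server API or the internal test framework.

```python
# Sketch: start an Integrator job, poll until it finishes, and record
# its status and runtime. All endpoint and field names are hypothetical.
import time
import requests

BASE_URL = "http://localhost:7775"  # hypothetical Integration Server address

def run_integrator_job(project, job, poll_interval=5.0, timeout=1800.0):
    """Start a job, poll its status, and return (status, runtime_seconds)."""
    start = time.monotonic()
    resp = requests.get(f"{BASE_URL}/start",
                        params={"project": project, "job": job}, timeout=30)
    resp.raise_for_status()
    job_id = resp.json()["id"]  # hypothetical response field

    while time.monotonic() - start < timeout:
        status = requests.get(f"{BASE_URL}/status",
                              params={"id": job_id}, timeout=30).json()["status"]
        if status in ("Completed successfully", "Failed"):
            return status, time.monotonic() - start
        time.sleep(poll_interval)
    return "Timed out", time.monotonic() - start

if __name__ == "__main__":
    status, runtime = run_integrator_job("sampleProject", "default")
    print(f"Job finished with status {status!r} after {runtime:.1f}s")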
Jedox Web is also tested via a self-developed framework, which at its core uses the WebDriver API to simulate real-world browser usage of Jedox Web. More than 10,000 test cases are executed daily on several execution environments, ranging from atomic functionality tests (for example, creating and using form elements, generating PDF documents, or executing tasks) to tests of complex business applications (such as the demo applications shipped with Jedox). Test results are again reported in the web-based Test Monitor, and test execution events are additionally reported in automated emails sent to the development teams involved.
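As a sketch of such a WebDriver-based end-to-end check, the example below drives a real browser (here via Selenium, a common WebDriver implementation) against a login page and verifies a basic interaction. The URL, credentials, and element locators are placeholders; the actual framework and test cases are internal to Jedox.

```python
# Sketch of an atomic end-to-end check using the WebDriver API:
# log in through the web UI and verify that the workspace appears.
# URL and locators are placeholder assumptions, not Jedox Web's real ones.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("http://localhost/ui/login/")  # placeholder URL
    wait = WebDriverWait(driver, timeout=15)

    # Fill in the login form (placeholder locators and credentials).
    wait.until(EC.presence_of_element_located((By.NAME, "user"))).send_keys("admin")
    driver.find_element(By.NAME, "pass").send_keys("admin")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    # Atomic functionality check: the main workspace becomes visible.
    wait.until(EC.visibility_of_element_located((By.ID, "workspace")))
    print("Login test passed")
finally:
    driver.quit()
```

In a daily run, thousands of such cases would be scheduled across several execution environments, with results collected centrally and failures triggering the automated email notifications described above.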
The Jedox Excel Add-in is likewise tested automatically, in a framework that uses the Ranorex tool for test execution. In addition, the Add-in is tested in repeated cycles before each release, using pre-specified test plans covering groups of functionality.
Beyond the regular automated test runs, specific manual tests are executed for out-of-line releases (e.g., patch releases), based on regression tests and pre-specified test plans. Finally, a formalized internal feedback loop provides information on functionality and software defects from Jedox's own day-to-day use of its products ("dogfooding").
Updated January 29, 2024