Alyvix

From Wikipedia, the free encyclopedia
Alyvix
Developer(s): Violet Atom Sagl (Alan Pipitone) and Würth Phoenix Srl (Francesco Melchiori)
Initial release: Version 3.0, 2020
Stable release: 3.5.0[1] / 2023-09-14
Written in: Python
Operating system: Microsoft Windows
Type: IT monitoring, synthetic monitoring and application performance management
License: GNU GPL v3
Website: alyvix.com

Alyvix is an open source[2][3] software application developed in Python for IT monitoring, synthetic monitoring and application performance management on Windows computers. It is used for visually monitoring fixed applications, streamed and cloud[1] applications (including encrypted ones[4]), and websites, as well as for robotic process automation.

Alyvix lets users interact with an application's graphical user interface (GUI) to describe what should be seen onscreen after a sequence of interactions, and then later compare that description against the application's current GUI in that state whenever desired.

Operation

Alyvix works in two main stages: GUI description, and interactive GUI replay. In the description phase (using Alyvix Editor), Alyvix captures the screen and then allows the user to describe what to look for,[5] such as images, text labels, buttons and text fields, by drawing and annotating directly on the screen capture.

A screenshot of the editing phase when creating an Alyvix test case
A screenshot of the annotation phase when creating an Alyvix test case

The user then combines these elements with a visual programming language that describes a sequence of desired interaction steps (for instance, clicking one of the buttons, or inserting a predefined string into one of the text fields) and how those steps proceed from one to the next, along with the original series of screen captures. This description is then saved in an open format called a test case.

Once this test case is created, Alyvix can use it to interactively replay that application interaction description as many times as desired while the application is "live". In this mode (called Alyvix Robot), Alyvix attempts to visually recognize[6] what is shown in the GUI at a particular moment using the open source computer vision library OpenCV. It then cycles through the recognition and interaction phases, applying the user-defined actions in the current step to the interface it sees.
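Alyvix's actual recognition pipeline is not detailed here, but the core technique behind visual recognizers such as OpenCV's template matching can be illustrated with a minimal, self-contained sketch. The "screen", "button" and threshold below are hypothetical, and normalized cross-correlation stands in for OpenCV's optimized implementation:

```python
# Illustration of template matching via normalized cross-correlation,
# the kind of technique OpenCV provides. NOT Alyvix's actual code;
# image data and the 0.95 threshold are hypothetical.

def match_template(image, template, threshold=0.95):
    """Return (row, col) of the best match of `template` in `image`
    (2D lists of grayscale values), or None if no position reaches
    the normalized cross-correlation `threshold`."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    t_flat = [v for row in template for v in row]
    t_norm = sum(v * v for v in t_flat) ** 0.5 or 1.0
    best, best_score = None, threshold
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = [image[r + dr][c + dc]
                     for dr in range(th) for dc in range(tw)]
            p_norm = sum(v * v for v in patch) ** 0.5 or 1.0
            score = sum(a * b for a, b in zip(patch, t_flat)) / (p_norm * t_norm)
            if score >= best_score:
                best, best_score = (r, c), score
    return best

# A 4x4 "screen" containing a 2x2 "button" at row 1, column 2
screen = [[0, 0, 0, 0],
          [0, 0, 9, 9],
          [0, 0, 9, 9],
          [0, 0, 0, 0]]
button = [[9, 9],
          [9, 9]]
print(match_template(screen, button))  # -> (1, 2)
```

In a real check, a miss (a `None` result) would be retried until a timeout expires, which is what the monitoring thresholds described below build upon.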

Use in monitoring

While Alyvix can be used purely for automation, it also allows users to declare warning and critical thresholds that are useful for monitoring, based on visual recognition timeouts. When a timeout is exceeded, it can be reported to a monitoring system using the Nagios and Icinga[7] protocols.
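The Nagios plugin convention referred to here maps a check result onto an exit code (0 = OK, 1 = WARNING, 2 = CRITICAL) plus a status line that may carry performance data after a `|` separator. A minimal sketch of how a measured transaction duration could be reported this way (the check name and threshold values are hypothetical, and this is not Alyvix's own reporting code):

```python
# Sketch of the standard Nagios plugin output convention:
# exit code + "STATUS - text | label=value;warn;crit" performance data.
# Check name and thresholds are hypothetical.

OK, WARNING, CRITICAL = 0, 1, 2

def nagios_report(check_name, duration_s, warn_s, crit_s):
    """Map a measured duration onto a Nagios state and build the
    conventional status line with performance data."""
    if duration_s >= crit_s:
        state, label = CRITICAL, "CRITICAL"
    elif duration_s >= warn_s:
        state, label = WARNING, "WARNING"
    else:
        state, label = OK, "OK"
    perfdata = f"{check_name}={duration_s}s;{warn_s};{crit_s}"
    return state, f"{label} - {check_name} took {duration_s}s | {perfdata}"

state, line = nagios_report("login_form", 3.2, warn_s=5.0, crit_s=10.0)
print(state, line)
# -> 0 OK - login_form took 3.2s | login_form=3.2s;5.0;10.0
```

A monitoring system consuming this output uses the exit code for alerting and the performance data for trend graphs.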

While Alyvix Robot can run a script to make a single check, what is usually needed in monitoring scenarios is to run many such checks at regular intervals, say every 5 minutes. Thus Alyvix needs to integrate with a monitoring system, which may not be open source. Coordinating this integration is Alyvix Service, which schedules test case runs over multiple target servers, manages configuration settings such as how often to run each test case, records the measurements made by Alyvix Robot, and provides that data and reports via an open API. Any monitoring system, such as NetEye, only needs to add a module that calls the open API as necessary.
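The scheduling job Alyvix Service performs can be illustrated with a small interval-based sketch. The test-case names and intervals below are hypothetical, and this is not Alyvix Service's actual implementation:

```python
# Minimal illustration of interval-based test-case scheduling,
# the kind of coordination Alyvix Service provides. Names and
# intervals are hypothetical.

def due_test_cases(schedule, last_run, now):
    """Return, sorted by name, the test cases whose interval has
    elapsed since their last run. `schedule` maps name -> interval
    in seconds; `last_run` maps name -> timestamp of the previous
    run (a missing entry means the case has never run)."""
    return sorted(
        name for name, interval in schedule.items()
        if now - last_run.get(name, float("-inf")) >= interval
    )

schedule = {"login_check": 300, "report_check": 900}  # every 5 / 15 minutes
last_run = {"login_check": 1000, "report_check": 1000}
print(due_test_cases(schedule, last_run, now=1400))  # -> ['login_check']
```

A service built around such a loop would dispatch each due test case to a target server, record the resulting measurements, and expose them through its API.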

References

  1. ^ a b "Alyvix Stable Release 3.5.0". Alyvix. Retrieved 2023-09-14.
  2. ^ "Alyvix via Python Pip". PyPI. Retrieved 2023-06-07.
  3. ^ "SourceForge Alyvix Review". SourceForge. Retrieved 2023-10-31.
  4. ^ "End user experience monitoring for cloud applications". SFSCON. Retrieved 2024-03-11.
  5. ^ "Digital Innovation through the Lens of Alyvix". SFSCON. Retrieved 2023-12-06.
  6. ^ "Alyvix: Under the Hood". FOSDEM 2017. Retrieved 2023-12-01.
  7. ^ "System Diagnostics: A Deeper Understanding". Icinga Camp Berlin. Retrieved 2024-03-11.