How does it do it?
1. Capture network packets
Spider probes are deployed on the servers or services to capture network traffic.
They are attached to the node's physical or virtual network interface(s).
On Kubernetes, they can even be deployed remotely from the UI!
Driven by Spider's remote configuration, the probes capture raw packets transferred between systems.
To make the traffic easier to understand, they resolve network server names and keep tracking them continuously.
Filtered IP packets, associated TCP sessions and hosts are continuously streamed to the Spider server for parsing.
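To make the capture step concrete, here is a minimal sketch of what a probe could do: capture filtered TCP packets on one interface, resolve peer names, and emit per-packet summaries. This is not Spider's actual probe code; the scapy library, the interface name and the output format are assumptions chosen for illustration.

```python
# Illustrative sketch only: not Spider's probe implementation.
# Assumes scapy is installed and the script runs with capture privileges.
import json
import socket
from scapy.all import sniff, IP, TCP

dns_cache = {}  # resolved server names, refreshed as traffic is observed

def resolve(ip):
    """Reverse-resolve an IP once and cache it, mimicking continuous name tracking."""
    if ip not in dns_cache:
        try:
            dns_cache[ip] = socket.gethostbyaddr(ip)[0]
        except (socket.herror, socket.gaierror):
            dns_cache[ip] = ip
    return dns_cache[ip]

def on_packet(pkt):
    """Summarise one filtered TCP/IP packet; a real probe would stream this to the Spider server."""
    if pkt.haslayer(IP) and pkt.haslayer(TCP):
        record = {
            "src": resolve(pkt[IP].src),
            "dst": resolve(pkt[IP].dst),
            "sport": pkt[TCP].sport,
            "dport": pkt[TCP].dport,
            "bytes": len(pkt),
        }
        print(json.dumps(record))

# Capture only TCP traffic on one interface; here the filter is hard-coded
# instead of being driven by Spider's remote configuration.
sniff(iface="eth0", filter="tcp", prn=on_packet, store=False)
```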
2. Rebuild communications
The streaming data flow from all probes is parsed on the fly to rebuild higher-level communications.
Filters, templates and tags are applied to enrich communications and to select what to keep.
The data is then streamed to the search engine, where it becomes available to the UI.
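As an illustration of this rebuilding step, the sketch below groups the streamed packet summaries into higher-level communications and enriches them with tags. The grouping key, the tag rules and the field names are assumptions, not Spider's actual parsing logic.

```python
# Illustrative sketch only: aggregate streamed packet records into
# higher-level "communications" and tag them. Field names are assumptions.
from collections import defaultdict

TAG_RULES = {443: "https", 5432: "postgres"}  # hypothetical tag templates

def rebuild(records):
    """Group packet records by (src, dst, dport) and enrich them with tags."""
    sessions = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for r in records:
        key = (r["src"], r["dst"], r["dport"])
        sessions[key]["packets"] += 1
        sessions[key]["bytes"] += r["bytes"]
    docs = []
    for (src, dst, dport), stats in sessions.items():
        docs.append({
            "src": src,
            "dst": dst,
            "dport": dport,
            "tag": TAG_RULES.get(dport, "other"),
            **stats,
        })
    return docs  # a real pipeline would stream these documents to the search engine
```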
3. Offer search and visual analysis tools
The UI makes all communications available to users, with dynamic filtering and high-level visual representations.
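A trivial sketch of dynamic filtering, as a UI backend might apply it to the indexed communications; the document list and field names follow the sketches above and are assumptions.

```python
# Illustrative sketch only: apply arbitrary field=value filters to the
# communication documents produced by rebuild().
def search(docs, **filters):
    """Return communications matching every given field=value filter."""
    return [d for d in docs if all(d.get(k) == v for k, v in filters.items())]

# Example: list all communications tagged "https" toward a given host (hypothetical name).
# matches = search(docs, tag="https", dst="payments.internal")
```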
All along
Throughout the whole process, probe availability and parsing quality are tracked continuously, so the completeness and quality of the parsing are always visible.
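As a rough illustration, completeness can be thought of as the ratio of parsed to captured traffic, and availability as probes reporting within a timeout. The sketch below uses these simple definitions, which are assumptions rather than Spider's actual metrics.

```python
# Illustrative sketch only: simple completeness and availability checks.
# Counter names and the 30-second heartbeat timeout are assumptions.
import time

def completeness(captured_count, parsed_count):
    """Fraction of captured packets that were successfully parsed."""
    return parsed_count / captured_count if captured_count else 1.0

def probe_available(last_heartbeat, now=None, timeout=30):
    """A probe is considered available if it reported within the timeout."""
    now = time.time() if now is None else now
    return (now - last_heartbeat) <= timeout
```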