156 posts tagged with "features"

Reworked Template feature

· 4 min read

Previous state of URL templates​

For many months, it has been possible to define URL templates in user settings to perform URL matching in filtering and stats:

Config​

Grid filtering​

Stats​

It worked, but:

  • It was a pain to configure - a regular expression for each URL
  • It was not very explicit - the regular expression served as the label
  • It was not efficient - the regular expressions were used as filters in Elasticsearch queries, which made Elasticsearch run ALL regular expressions on all URIs every time the filter was selected... :(
  • Templates were associated with a user, but not with a Whisperer or a team

I then did a major rework of the feature to make it more efficient and user friendly... and to allow more features to be built on it :)

The new Request Templates!​

First, you may notice from the name that these are no longer URL templates, but Request templates. Why? Because not only the URL can be analysed, but also the HTTP method, the headers and the body.

You define for each template:

  • A name
  • The fields to parse
  • A regular expression

And Spider associates each HTTP communication with the first matching template during parsing.

Pros: much faster, more flexible, easy to use and extensible. Cons: one-time processing only; you cannot change communications that have already been parsed and templated. And testing patterns is complicated, for now.

Look.

Configuration​

Request templates are part of the Whisperer parsing configuration.

  • The regular expression is matched over a combination of the selected request parts.
  • The name of the first matching template is saved in the req.template field of the HTTP communication
    • The name can be plain text or include values extracted from the match
    • The regular expression is used as a substitution pattern over the name
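The matching and naming logic described above can be sketched like this (the template shape and field names are illustrative assumptions, not Spider's actual data model):

```javascript
// Minimal sketch of template matching. A template is assumed to be
// { name, fields, pattern } — illustrative shape, not Spider's schema.
function applyTemplates(templates, request) {
  for (const t of templates) {
    // Build the matching input from the selected request parts
    const input = t.fields.map((field) => request[field]).join(' ');
    const re = new RegExp(t.pattern);
    if (re.test(input)) {
      // The name may reference capture groups ($1, $2...) and is applied
      // as a substitution pattern over the match
      return input.replace(re, t.name);
    }
  }
  return undefined; // no template matched: req.template stays empty
}

const templates = [
  { name: 'Get user $1', fields: ['method', 'uri'], pattern: '^GET /v1/users/(\\w+)\\b' },
];
console.log(applyTemplates(templates, { method: 'GET', uri: '/v1/users/42' }));
// → 'Get user 42'
```

Matching stops at the first matching template, which is why the order of templates in the configuration matters.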

The regular expressions above are enough to match all (or so I believe) the endpoints exposed by Flowbird Streetsmart.

Ex:

For Streetsmart, the configuration is maintained in this Google Sheet, and I then just copy-paste it into the Spider form. Gorgeous :)

Data filtering​

The templates can be filtered in the grid:

Or in the menu:

Detail view​

The template associated with the HTTP communication is visible in the detail view, which got a small reordering of fields:

As with any other field, you may add or remove the template from the filter with the small magnifying glass next to it.

Stats​

Templates are available as a grouping field in HTTP stats. And they make stats much more readable than the previous version:

What's the use of selecting Headers or Body in the template configuration?​

In some cases, like XML or JSONRPC, the URI is not enough to 'name' the request. Then, parsing the headers or the body becomes useful.

Example with PRM software configuration:

Which makes PRM exchanges much more readable compared to the URI only :) !!

However, this must be used with caution:

  • Parsing the headers introduces more memory usage on the server, but this is OK
  • Parsing the body implies decoding the body (otherwise manipulated as a buffer) and uncompressing it when it is encoded with Gzip, Deflate or Brotli.
    • As Spider is currently processing around 400 communications/s on Streetsmart, that could mean 400 unzips every second if the body has to be analysed.
    • So it must be limited to certain conditions.

I'm thinking of splitting the body from the other inputs, to limit body parsing to certain URIs... but I don't know yet how to keep that easy to configure.

We can even make stats on PRM:​

However... it is difficult to split successes and errors when an API always answers 200.

That's when you need Tags! They come next!

What's coming next?​

Request templates allow us to:

  • Derive a 'tagging' feature from them (next blog post)
  • Enrich the network map display and tooltips (in the coming weeks)
  • Implement a regular cron job on production to get automated stats

Deprecation​

The existing URL template feature is deprecated and will be removed soon. Should you have issues with this, please tell me.

Tagging HTTP communications

· 2 min read

When designing Request Templates, I realized that the system could be extended to extract Tags from the communications.

With a configuration similar to that of templates, you may extract values from the request or the response and add them as indexed fields in the HTTP communication resource.

These fields will then be available in the Grid, in filtering, in the HTTP communication detail view, and in the Stats grouping.

Configuration​

  • The tag name is used as the field name when storing:
    • req.tags.[name]
    • res.tags.[name]
  • If the pattern matches, the tag is associated with:
    • The values extracted from the capture groups of the regular expression
    • A true value if no group is present
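A minimal sketch of this extraction logic, assuming a tag configuration shaped like the templates (illustrative names and shapes, not Spider's actual schema):

```javascript
// Sketch of tag extraction. A tag config is assumed to be
// { name, fields, pattern } — hypothetical shape for illustration.
function extractTag(tag, message) {
  const input = tag.fields.map((field) => message[field]).join(' ');
  const match = new RegExp(tag.pattern).exec(input);
  if (!match) return undefined; // no tag stored
  // Capture groups become the tag value; a groupless pattern yields true
  return match.length > 1 ? match.slice(1).join(' ') : true;
}

// A tag with a capture group extracts a value...
const appTag = { name: 'app', fields: ['uri'], pattern: '^/([a-z]+)/' };
console.log(extractTag(appTag, { uri: '/prm/jsonrpc' })); // → 'prm'

// ...while a groupless tag is a boolean marker
const faultTag = { name: 'fault', fields: ['body'], pattern: '"fault"' };
console.log(extractTag(faultTag, { body: '{"fault":{}}' })); // → true
```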

Grid​

Tags can be displayed in dedicated columns. The columns are listed in the column list based on the configuration of selected Whisperers.

Detailed view​

Tags are listed in the first tab of the HTTP communication:

Tag fields can be used in searches like any others:

And dropdown filtering is available next to the column headers:

Stats​

Tags can be selected for grouping statistics:

To continue​

PRM​

All the examples below come from tags extracted from PRM communications. The configuration is available on the PRM Integration whisperer.

This configuration should be completed with Error tags, for instance: tag the response as an error when the body says so.

Streetsmart​

  • We may test extracting the park value from the response to give statistics by park.
  • Other ideas are welcome

However, we must be careful, because of the cost of body decompression.

Embedding raw headers in HTTP communications

· One min read

A new feature added at the request of the Flowbird integration team:

It is now possible to save the raw headers inside the HTTP communication resource.

  • This is interesting when the packets are not saved.
  • Filtering on headers is not applied to raw headers
  • Raw headers are filtered out (like content) when searching HttpCom resources by default, but they can be included with the "withContent": true parameter in the search body
  • Raw headers are automatically included when exporting communications
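For example, a search body requesting raw headers might look like this (only the withContent parameter comes from this feature; the other fields are illustrative):

```json
{
  "filters": {
    "req.method": "GET"
  },
  "withContent": true
}
```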

Effect​

The 'view source' link in the Headers tab then works even if the packets have not been saved.

Configuration​

On the Parsing Config tab of Whisperer settings.

You may also specify URIs for which you do NOT want to save rawHeaders (in case of privacy concerns, for instance).

Resource change example​

In the req and res fields:

"rawHeaders": {
  "content": "GET /v1/posEventStatuses?modulo=1&remainder=0&view=content&afterUpdateMarker%5B%5D=85597012&afterUpdateMarker%5B%5D=38696&range=100 HTTP/1.1\r\nAuthorization: JWT eyJhbGci...\r\nHost: itproduction_gateway\r\nX-Forwarded-Host: itproduction_gateway\r\nX-Forwarded-For: 10.0.0.129\r\nX-Forwarded-BasePath: /terminal/devicemonitoring\r\nConnection: close\r\nCorrelation-Token: 19IRIKnRm68N0Gl1x6AqtyLtNNa-ztCI",
  "size": 2990
}

Expect 100 continue

· One min read

As Whoosh sends Tickets to PRM using the HTTP Expect: 100-continue pattern, I added support for it to Spider.

What is it?

This HTTP protocol pattern allows a client to ask the server whether it is allowed to send a request body BEFORE sending it:

  • The client adds the Expect: 100-continue header to the request, without sending the body
  • If the server answers 100 Continue, the client may proceed
  • The client then sends the body and receives the 'real' response

Example:

Now, Spider can parse this communication pattern. The result is two communications:

  • The first one with the request headers and the 100 Continue answer.
  • The second one with the request headers and body, and the final server answer.

Example:

Whisperers upgrade and versions

· One min read

The Whisperers have received a technical upgrade:

  • Upgraded to Node 10 and the async/await programming pattern
  • Upgraded to the latest Node pcap library

To follow Whisperer migrations, I added a Whisperer version that is sent with the status to the server. The monitoring now shows this version in the Current Status grid:

Moreover, this grid has been improved to show:

  • In orange, the Whisperers that have not communicated for 2 minutes
  • In dark red, those that have not communicated for more than 1 day

Timeline component OpenSourced

· One min read

The Timeline is one of the components that took me the most brain hours, along with the map and the grid.

To enable Flowbird's Streetsmart project to use it, I decided to opensource it. This is my first opensource project... We'll see how it goes ;)

Repository​

The component has been available for a couple of months and has been successfully integrated into the Streetsmart monitoring UI. It embeds a small test application packaged as a Docker image that allows testing, and even development with hot reloading.

Changes​

Since its packaging as an opensource and standalone component, the Timeline has evolved (Changelog) thanks to the Streetsmart UI team's ideas, and all changes are available both on Spider and Streetsmart.

  • React 16 refactoring
  • Lots of options to control the behavior
  • The left margin adapts to the legend length
  • A small icon is displayed when the cursor is out of view
  • Autosliding with regular fetching when dragging the cursor left or right of the area
  • Visual improvements and bug fixes

Saving user settings to server

· 2 min read

For a long time, user settings were saved in local storage, but if you opened a new session on another computer, you had to set everything up as you wanted again.

Now, settings are saved on the server. They are split in two: the user's own settings and global settings:

User settings

  • Grid settings
  • Timeline settings
  • Refresh delays
  • Workspace composition (panels opened and width)
  • Selected view, mode and options
  • Usage stats and errors sending to server
  • ...

Global settings

  • Url templates
  • Merge patterns for replicas and clients
  • Correlation tokens
  • Stats configurations
  • Saved queries
  • ...

Features

  • Settings are stored after a change to one of them, but with a delay of x seconds. As settings are often not changed alone, this reduces server calls
    • So the user has to wait a bit before the settings are saved
    • The settings icon is shown with a white fill while a save is pending
    • The settings icon rotates while saving

  • The date of the last save is displayed in the settings panel

  • The list of saved settings is available in the source tab of the settings panel
  • Defaults
    • Saving settings is enabled by default on start (this cannot be changed)
    • Saving settings can be disabled in the settings panel
    • Saving settings is disabled by default when loading an external link (so as not to override the user's settings)
  • Global settings will be able to be overridden by team settings in the coming months
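The delayed saving described above behaves like a debounce; a minimal sketch (saveToServer, the delay value and the property names are placeholders for the real client code):

```javascript
// Sketch of delayed settings saving: changes are merged and pushed to
// the server only after a quiet delay, so grouped changes cost one call.
function createSettingsSaver(saveToServer, delayMs = 3000) {
  let pending = null;
  let timer = null;
  return {
    update(change) {
      pending = { ...pending, ...change };
      clearTimeout(timer);              // each change restarts the delay
      timer = setTimeout(() => {
        saveToServer(pending);          // one call for all batched changes
        pending = null;
      }, delayMs);
    },
    // Drives the "white fill" state of the settings icon
    get pendingSave() {
      return pending !== null;
    },
  };
}

const saver = createSettingsSaver((settings) => console.log('saving', settings));
saver.update({ refreshDelay: 5 });
saver.update({ gridColumns: ['method', 'uri'] });
console.log(saver.pendingSave); // → true
```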

Cloning an existing Whisperer configuration

· One min read

Very neat to avoid mistakes or save time when setting up a new Whisperer:

  • Now, when editing a Whisperer configuration, you may clone the configuration values of an existing Whisperer to speed up your editing.
    • The edit is still in progress: you may accept or cancel the changes.
  • This is possible for all 3 current configuration tabs:
    • Capture config
    • Parsing config
    • Share
  • The list of Whisperers to clone from is taken from the filtered list in the Whisperers selection input of the menu. Thus, you can use Elasticsearch's power to filter the ones you want to clone from.
    • Easy and neat. You just have to know it ;)

 

I don't know why it took me so long to start implementing this, I overuse it already :) So neat !!

Saved queries

· One min read

It is now possible to save existing queries and give them a name, to reload them later.

Note: The query only includes the generated filter of the search input; no information is saved about the selected time, the current view or the filter inputs.

Saving a query​

  • To save a query, click on the save icon

  • An input is shown to give it a name
    • If the query has already been saved (and is not modified), the name of the current query is shown
    • Changing the name will update / replace the saved query's name

  • You may give the same name to different queries. It is up to you to remove old ones.
  • The saved queries are linked to the selected View (HTTP, TCP...)

Loading a saved query​

  • Once saved, a query can be reloaded by clicking the load icon

  • The list of saved queries shows up, sorted alphabetically

  • The search input is filled with the selected query and the search is executed

Select filter improvement

· One min read

The filter selection dropdown has been improved.

  • Previously, the select applied the current query to show you the available options.
    • If you had previously selected a value, no other value could be displayed, as the query was filtering them out.
  • Now, the select gathers the available options by running the current query without applying the current filter's selection, thus allowing you to add more values to the selected options of this filter.

This is much more useful than showing a blank select dropdown ;-)