How Fast is SciChart’s WPF Chart? DirectX vs. Software Comparison
Posted by Andrew on 30 December 2014 07:25 PM

// TL;DR

Test results demonstrate that SciChart’s Direct3D10RenderSurface, available in the SciChart Enterprise and SDK Editions, is up to 18x faster than software rendering, with a median speed increase of 4x. Across a variety of tests and chart types, SciChart’s DirectX Renderer plugin outperformed the software renderers.

// Measuring WPF Chart Performance

WPF Chart Performance is really important to us at SciChart. We strive to have the best and fastest WPF Chart components in the world, but how do we demonstrate what we’ve achieved vs. our competitors to would-be customers?

We’ve created an application which measures our WPF Chart Performance in a number of scenarios, such as line, scatter, scrolling-line and many-series scenarios. The tests are designed to really stress the chart control and find its limits.

There are several factors which influence the overall performance of a WPF Chart Control, such as:

  • Number of Series in the chart
  • Types of Series (e.g. Line, Scatter, Area, Candlestick)
  • Number of data-points per series
  • Rate of change of data (number of points appended or removed per second)
  • Thickness of pens / number of pixels filled
  • Size of the chart viewport on screen
  • Number of transformations between data & pixel coordinates per second
  • Any additional calculations like Logarithmic Axis scaling or Auto Ranging

These factors are reflected in the test cases, each of which stresses different areas of the renderer.

// The Test Setup

Test Setup Configuration

As you may know, SciChart ships with several RenderSurface plugins, including the HighSpeedRenderSurface, the HighQualityRenderSurface and now the Direct3D10RenderSurface, a DirectX10/11 hardware-accelerated RenderSurface implementation.
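
For context, switching between these plugins amounts to assigning the chart’s RenderSurface. The snippet below is a minimal sketch in C#, assuming the type names mentioned above and typical v4-era namespaces; it is not copied from the test app itself:

```csharp
// A minimal sketch: selecting a RenderSurface plugin in code-behind.
// The namespaces below are assumptions based on SciChart v4-era naming and may differ by version.
using SciChart.Charting.Visuals;               // SciChartSurface (assumed namespace)
using SciChart.Drawing.HighSpeedRasterizer;    // HighSpeedRenderSurface (assumed namespace)
using SciChart.Drawing.DirectX.Context.D3D10;  // Direct3D10RenderSurface (assumed namespace)

public static class RenderSurfaceSetup
{
    public static void ChooseRenderer(SciChartSurface surface, bool useDirectX)
    {
        if (useDirectX)
        {
            // Hardware-accelerated DirectX10/11 renderer (Enterprise / SDK Editions)
            surface.RenderSurface = new Direct3D10RenderSurface();
        }
        else
        {
            // Software renderer
            surface.RenderSurface = new HighSpeedRenderSurface();
        }
    }
}
```

The same choice can also be made in XAML; the important point for the tests below is that only the RenderSurface changes between runs, while the data and chart setup stay identical.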

We’ve tested the latest release of SciChart: v4, and compared the three renderer plugins across a number of different scenarios in order to highlight the difference in performance of the SciChart RenderSurface implementations.

What Tests are Run

A number of tests are run in our WPF Chart Performance Comparison app, each stressing different areas of the renderer. These tests are designed to push the chart hard, with huge numbers of series or points, or many updates per second.

For each test, FPS (refresh rate) is measured via the SciChartSurface.Rendered event, which fires once after each drawing pass completes.
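
As an illustration, a simple FPS counter can be hooked up to this event. This is our own hedged sketch, not the actual test harness code; it assumes only the SciChartSurface.Rendered event described above:

```csharp
// A minimal sketch of FPS measurement via SciChartSurface.Rendered (not the actual
// test harness). A Stopwatch times the gap between successive Rendered events and an
// exponential moving average smooths the frame-to-frame jitter.
using System;
using System.Diagnostics;
using SciChart.Charting.Visuals; // SciChartSurface (assumed namespace)

public sealed class FpsCounter
{
    private readonly Stopwatch _stopwatch = Stopwatch.StartNew();
    private double _smoothedFps;

    public FpsCounter(SciChartSurface surface)
    {
        surface.Rendered += (sender, args) =>
        {
            double elapsedSeconds = _stopwatch.Elapsed.TotalSeconds;
            _stopwatch.Restart();

            if (elapsedSeconds <= 0) return;

            double instantaneousFps = 1.0 / elapsedSeconds;
            _smoothedFps = _smoothedFps == 0
                ? instantaneousFps
                : 0.9 * _smoothedFps + 0.1 * instantaneousFps;

            Console.WriteLine($"FPS: {_smoothedFps:F1}");
        };
    }
}
```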

The test cases are as follows:

Test 1: NxM Series Test

SciChart Performance Test 1: NxM Series

N series of M points are appended to an XyDataSeries, then the chart redraws (the same data) as fast as possible for 20 seconds per test. FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Iterating Data Series, Coordinate Transformation and Drawing.

NOTE: Resampling is not applicable for short data-series (below a few thousand points) so this tests iterating over many data-series, coordinate transformation as well as raw drawing speed.
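
To make the setup concrete, here is a hedged sketch of how N line series of M points might be constructed. This is our illustration, not the actual test code; it uses XyDataSeries, FastLineRenderableSeries and ZoomExtents, with namespaces assumed as before:

```csharp
// A minimal sketch of the NxM Series test setup (not the actual test code):
// N line series of M points each are appended and added to the chart.
using System;
using SciChart.Charting.Model.DataSeries;          // XyDataSeries (assumed namespace)
using SciChart.Charting.Visuals;                   // SciChartSurface (assumed namespace)
using SciChart.Charting.Visuals.RenderableSeries;  // FastLineRenderableSeries (assumed namespace)

public static class NxMSeriesTest
{
    public static void Setup(SciChartSurface surface, int n, int m)
    {
        var random = new Random(0);

        for (int i = 0; i < n; i++)
        {
            var dataSeries = new XyDataSeries<double, double>();

            // Append M points of random-walk data to each series
            double y = 0;
            for (int j = 0; j < m; j++)
            {
                y += random.NextDouble() - 0.5;
                dataSeries.Append(j, y);
            }

            surface.RenderableSeries.Add(new FastLineRenderableSeries { DataSeries = dataSeries });
        }

        surface.ZoomExtents();

        // The test then redraws the same data as fast as possible for 20 seconds,
        // timing the gaps between SciChartSurface.Rendered events (see the FPS sketch above).
    }
}
```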

Test 2: Scatter Series Test

SciChart Performance Test 2: Scatter Series

N scatter points are appended to an XyDataSeries, then the chart redraws. Immediately after, the points are updated in a Brownian-motion fashion and the chart is drawn again. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Coordinate Transformation, Geometry Generation (Ellipse) and Drawing.

NOTE: Resampling is not applicable for Scatter Series as our standard resampling algorithms do not work with scatter data, so this test effectively tests the geometry generation and drawing speed.
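
As a rough illustration of the per-frame work in this test, here is a hedged sketch (not the actual test code) that perturbs every point with a small random step each frame. For clarity it clears and re-appends the data series, whereas the real test app may well update points in place:

```csharp
// A minimal sketch of the Scatter Series test (not the actual test code):
// N scatter points are created, then random-walked (Brownian motion) each frame.
using System;
using System.Linq;
using SciChart.Charting.Model.DataSeries;          // XyDataSeries (assumed namespace)
using SciChart.Charting.Visuals;                   // SciChartSurface (assumed namespace)
using SciChart.Charting.Visuals.PointMarkers;      // EllipsePointMarker (assumed namespace)
using SciChart.Charting.Visuals.RenderableSeries;  // XyScatterRenderableSeries (assumed namespace)

public sealed class ScatterTest
{
    private readonly Random _random = new Random(0);
    private readonly XyDataSeries<double, double> _dataSeries = new XyDataSeries<double, double>();
    private readonly double[] _x, _y;

    public ScatterTest(SciChartSurface surface, int n)
    {
        _x = Enumerable.Range(0, n).Select(i => _random.NextDouble()).ToArray();
        _y = Enumerable.Range(0, n).Select(i => _random.NextDouble()).ToArray();
        _dataSeries.Append(_x, _y);

        surface.RenderableSeries.Add(new XyScatterRenderableSeries
        {
            DataSeries = _dataSeries,
            PointMarker = new EllipsePointMarker { Width = 5, Height = 5 }
        });
    }

    // Called once per frame: random-walk every point, which triggers the next redraw
    public void BrownianStep()
    {
        for (int i = 0; i < _x.Length; i++)
        {
            _x[i] += (_random.NextDouble() - 0.5) * 0.002;
            _y[i] += (_random.NextDouble() - 0.5) * 0.002;
        }

        _dataSeries.Clear();
        _dataSeries.Append(_x, _y);
    }
}
```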

Test 3: FIFO Series Test

SciChart Performance Test 3: FIFO Series

N points are appended to a FIFO (circular buffer) series, then a single point is appended (and one dropped), which triggers a redraw of the chart. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Copying Circular Buffers (FIFO Series), Resampling and Drawing.

NOTE: Because the data is random (0-1), lots of pixels are drawn. This test places a heavier emphasis on drawing than on resampling, although at higher point counts resampling starts to kick in.
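
For reference, a hedged sketch of the FIFO setup is shown below. It is not the actual test code; it assumes the XyDataSeries FifoCapacity property, which turns the series into a circular buffer:

```csharp
// A minimal sketch of the FIFO Series test (not the actual test code):
// a series with FifoCapacity = N is pre-filled, then one point is appended
// per frame, dropping the oldest point and triggering a redraw.
using System;
using SciChart.Charting.Model.DataSeries;          // XyDataSeries (assumed namespace)
using SciChart.Charting.Visuals;                   // SciChartSurface (assumed namespace)
using SciChart.Charting.Visuals.RenderableSeries;  // FastLineRenderableSeries (assumed namespace)

public sealed class FifoTest
{
    private readonly Random _random = new Random(0);
    private readonly XyDataSeries<double, double> _dataSeries;
    private int _nextX;

    public FifoTest(SciChartSurface surface, int n)
    {
        // FifoCapacity makes the series behave as a circular buffer of N points
        _dataSeries = new XyDataSeries<double, double> { FifoCapacity = n };

        // Pre-fill the buffer with N random (0-1) values
        for (int i = 0; i < n; i++)
            _dataSeries.Append(_nextX++, _random.NextDouble());

        surface.RenderableSeries.Add(new FastLineRenderableSeries { DataSeries = _dataSeries });
    }

    // Called once per frame: one point appended, one point dropped
    public void AppendOne()
    {
        _dataSeries.Append(_nextX++, _random.NextDouble());
    }
}
```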

Test 4: Append Data Test

SciChart Performance Test 4: Append (NoiseFactor=0)

SciChart Performance Test 4: Append (NoiseFactor=100)

SciChart Performance Test 4: Append (NoiseFactor=1000)

N/3 points are appended to 3 DataSeries, then M points are appended between each draw of the chart. The data is a random walk, but we vary the noise to create more or less noisy waves. This has the effect of stressing drawing more (when the data is noisier) vs. resampling more (when it is less noisy).

Areas stressed: Appending Data, Resampling, Auto-Ranging and Drawing.

NOTE: As the noise factor is increased, more pixels must be drawn, which stresses the rendering (drawing) more. For lower noise, resampling is the dominant influence on performance.
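
To illustrate the data generator, here is a hedged sketch of a random walk with a configurable noise factor. This is our interpretation of the test, not its actual code; the name NoiseFactor is taken from the captions above:

```csharp
// A minimal sketch of the Append test data generator (not the actual test code):
// a random walk whose per-point jitter scales with NoiseFactor.
// Higher NoiseFactor => noisier waves => more pixels to draw per frame.
using System;
using SciChart.Charting.Model.DataSeries; // XyDataSeries (assumed namespace)

public sealed class NoisyRandomWalk
{
    private readonly Random _random = new Random(0);
    private readonly double _noiseFactor;
    private double _trend;
    private int _nextX;

    public NoisyRandomWalk(double noiseFactor)
    {
        _noiseFactor = noiseFactor;
    }

    // Appends M new points to the given series between draws
    public void AppendPoints(XyDataSeries<double, double> dataSeries, int m)
    {
        for (int i = 0; i < m; i++)
        {
            _trend += _random.NextDouble() - 0.5;                       // slow random walk
            double noise = (_random.NextDouble() - 0.5) * _noiseFactor; // fast jitter on top
            dataSeries.Append(_nextX++, _trend + noise);
        }
    }
}
```

With NoiseFactor=0 the output is a smooth random walk, so resampling dominates; at NoiseFactor=1000 the jitter swamps the trend and drawing dominates, matching the note above.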

Test Setup Hardware

  • For the Test Setup we used a desktop workstation with:
    • Intel i7-5820K 3.3GHz
    • 32GB DDR3-2400 RAM
    • 2GB NVidia GTX760 video card
    • Windows 8.1 Professional
  • Tests were run on a 16:10 1900×1200 monitor with the Test Suite application maximised.

For SciChart’s WPF Chart, CPU and RAM speed are the biggest influences on overall performance. For the DirectX Renderer, the GPU becomes more significant. Please note that we also get decent numbers on a dual-core laptop: you don’t need a powerful machine to get good results, but it helps (especially for DirectX).

// Test Results

We’ve included the test results below:

In Table Form

SciChart v4 DirectX vs. HighSpeed (software) vs. HighQuality (software) performance comparison, showing FPS (refresh rate). Higher numbers are better!

In Chart Form

No WPF Charting Performance Test would be complete without some results in chart form! So, we’ve included the above results as a set of bar charts.

Read more »




Customer Case Study – BlueShift ONE System
Posted by Andrew on 17 July 2014 03:15 PM

Recently, one of our earliest customers posted a case study about SciChart on their blog. We are re-posting it here as a case study with permission from BlueShift. Thanks BlueShift! We love working with you! – SciChart Team

Customer Case Study – BlueShift ONE

From the developers: “BlueShift’s ONE System brings together account management, demand planning and financial control to deliver a unified forecast from volume down to gross margin. Their software provides a ‘one number’ approach to promotional plan management and demand forecasting, and it is in this respect that SciChart plays such an important role, providing transparency and flexibility in performing these roles.”

A great charting tool is a necessity to allow us to provide users of the ONE solution with information in a fast and efficient manner. We spent quite a while in analysis, pushing the best third-party charting software to their limits before deciding on implementing the SciChart tool.

SciChart is an extremely fast and flexible solution for building charts of many different varieties, and it has allowed us to construct graphs that users can interact with and customise on the fly in a number of ways. With the ability to render millions of data points in an endless array of measures, we can show live data from our database to users nearly instantly.

BlueShift ONE screenshot 1: chart with measures, customised legend and Time Navigator

In the image above we can identify a chart containing various measures, a legend that has been customised to show aggregated data over a selection made by the user, and a Time Navigator underneath to give the user context on what time range they are currently seeing. We have taken advantage of SciChart’s “ChartModifier” extensions to write our own modifications that specify exactly how the chart behaves when interacted with by the mouse or keyboard. The result is a chart that the user can very accurately pan and zoom, make selections over (for editing or analysis), and other proprietary/secret BlueShift stuff.

Our charts support run-time specification by the user of which measures are shown and in what order. Options around the style of each measure are also available, ranging from whether it is a line, column or area type, to what colour it is, how thick it is and whether the line is dashed. The chart above also supports two Y-axes, and it allows aggregation to different time levels.

BlueShift ONE screenshot 2: scatter chart with customised summary circles

In this image we see a very different chart that shows data points that have been customised to display a different type of circle based on a summary of the data under that period. To get this result, we had to blend a set of tools, including the ScatterChart renderable series, hard-coded Y-axes ranges and a suite of optimised images to represent each type of summary circle.

When it comes to visually representing information in meaningful ways, we are very pleased with how seamlessly our charts integrate with the rest of the ONE system.

Would you like a case study linking back to your website? Contact us; we would love to hear from you!
[SciChart Team]


Read more »



