How Fast is SciChart’s WPF Chart? DirectX vs. Software Comparison
Posted by Andrew on 30 December 2014 07:25 PM

// TL;DR

Test results demonstrate that SciChart’s Direct3D10RenderSurface, available in the SciChart Enterprise and SDK Editions, is up to 18x faster than software rendering, with a median speed increase of 4x. Across a variety of tests and chart types, SciChart’s DirectX Renderer plugin excelled in performance vs. software!

// Measuring WPF Chart Performance

WPF Chart Performance is really important to us at SciChart. We strive to have the best and fastest WPF Chart components in the world, but how do we demonstrate what we’ve achieved vs. our competitors to would-be customers?

We’ve created an application which measures our WPF Chart Performance in a number of scenarios, such as line, scatter, scrolling-line and many-series tests. The tests are designed to really stress the chart control and find its limits.

There are several factors which influence the overall performance of a WPF Chart Control, such as:

  • Number of Series in the chart
  • Types of Series (e.g. Line, Scatter, Area, Candlestick)
  • Number of data-points per series
  • Rate of change of data (number of points appended or removed per second)
  • Thickness of pens / number of pixels filled
  • Size of the chart viewport on screen
  • Number of transformations between data & pixel coordinates per second
  • Any additional calculations like Logarithmic Axis scaling or Auto Ranging

These factors are reflected in the test cases, which stress different areas of the renderer.

// The Test Setup

Test Setup Configuration

As you may know, SciChart ships with several RenderSurface plugins including the HighSpeedRenderSurface, HighQualityRenderSurface, and now, the Direct3D10RenderSurface, a DirectX10/11 hardware-accelerated RenderSurface implementation.

We’ve tested the latest release of SciChart: v4, and compared the three renderer plugins across a number of different scenarios in order to highlight the difference in performance of the SciChart RenderSurface implementations.

What Tests are Run

A number of tests are run in our WPF Chart Performance Comparison app, each stressing different areas of the renderer. The tests are designed to really push the chart by using huge numbers of series or points, or many updates per second.

Each test is run, and the FPS (refresh rate) is measured via the SciChartSurface.Rendered event, which fires once after each drawing pass completes.
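
For the curious, here is a minimal sketch of how a refresh rate can be derived from the time between successive Rendered callbacks. This is our illustration for this post, not the exact code in the test suite; the FpsMeter class name and fields are placeholders.

using System.Diagnostics;

// Minimal FPS meter: call Frame() from the SciChartSurface.Rendered handler,
// which fires once after each drawing pass completes.
public class FpsMeter
{
    private readonly Stopwatch _stopwatch = Stopwatch.StartNew();
    private long _lastTicks;

    public double LastFps { get; private set; }

    public void Frame()
    {
        long now = _stopwatch.ElapsedTicks;
        double elapsedSeconds = (now - _lastTicks) / (double)Stopwatch.Frequency;
        _lastTicks = now;

        if (elapsedSeconds > 0)
            LastFps = 1.0 / elapsedSeconds;   // instantaneous FPS between two draws
    }
}

// Usage (handler signature simplified):
//   sciChartSurface.Rendered += (s, e) => fpsMeter.Frame();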

The test cases are as follows:

Test 1: NxM Series Test

SciChart Performance Test 1: NxM Series

N series of M points are appended to an XyDataSeries, then the chart redraws (the same data) as fast as possible for 20 seconds per test. FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Iterating Data Series, Coordinate Transformation and Drawing.

NOTE: Resampling is not applicable for short data-series (below a few thousand points), so this test exercises iterating over many data-series and coordinate transformation, as well as raw drawing speed.
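
To give an idea of what this setup looks like in code, here is an illustrative sketch (not the actual test code); the series type, namespaces and counts are assumptions and depend on your SciChart version.

private static readonly Random Rand = new Random();

// Illustrative only: build N line series of M points each, as in the NxM test.
private void PopulateNxM(SciChartSurface surface, int n, int m)
{
    for (int i = 0; i < n; i++)
    {
        var dataSeries = new XyDataSeries<double, double>();
        double y = 0;

        for (int j = 0; j < m; j++)
        {
            y += Rand.NextDouble() - 0.5;   // simple random walk
            dataSeries.Append(j, y);        // X = index, Y = walk value
        }

        surface.RenderableSeries.Add(
            new FastLineRenderableSeries { DataSeries = dataSeries });
    }
}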

Test 2: Scatter Series Test

SciChart Performance Test 2: Scatter Series

N Scatter points are appended to an XyDataSeries, then the chart redraws. Immediately after, the points are updated in a Brownian-motion fashion and the chart is drawn again. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Coordinate Transformation, Geometry Generation (Ellipse) and Drawing.

NOTE: Resampling is not applicable for Scatter Series as our standard resampling algorithms do not work with scatter data, so this test effectively tests the geometry generation and drawing speed.
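
A rough sketch of the Brownian-motion update between frames might look like the following. This is our illustration only: the real test presumably updates points in place, whereas here we simply clear and re-append via the basic Append API, batched inside SuspendUpdates so the change results in a single redraw (method and property names assumed per the SciChart version in use).

private static readonly Random Rand = new Random();

// Illustrative Brownian-motion step for the scatter test (not the actual test code).
private void BrownianStep(XyDataSeries<double, double> series, double[] xs, double[] ys, double step)
{
    for (int i = 0; i < xs.Length; i++)
    {
        xs[i] += (Rand.NextDouble() - 0.5) * step;   // random walk in X
        ys[i] += (Rand.NextDouble() - 0.5) * step;   // random walk in Y
    }

    using (series.SuspendUpdates())   // batch the whole update into one redraw
    {
        series.Clear();
        for (int i = 0; i < xs.Length; i++)
            series.Append(xs[i], ys[i]);
    }
}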

Test 3: FIFO Series Test

SciChart Performance Test 3: FIFO Series

N points are appended to a FIFO (circular buffer) series, then a single point is appended (and one dropped), which triggers a redraw of the chart. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Copying Circular Buffers (FIFO Series), Resampling and Drawing.

NOTE: Because the data is random (0-1), many pixels are drawn. This test places a heavier emphasis on drawing than on resampling, although at higher point counts resampling starts to kick in.
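
As a minimal sketch of the FIFO setup (again illustrative, with assumed property names and placeholder sizes): setting FifoCapacity turns the series into a fixed-size circular buffer, so once it is full each Append drops the oldest point.

private static readonly Random Rand = new Random();
private int _nextX;

// Illustrative only: pre-fill a FIFO series with N random (0-1) points.
private XyDataSeries<double, double> CreateFifoSeries(int n)
{
    var fifoSeries = new XyDataSeries<double, double> { FifoCapacity = n };

    for (_nextX = 0; _nextX < n; _nextX++)
        fifoSeries.Append(_nextX, Rand.NextDouble());

    return fifoSeries;
}

// Each tick appends one point (and implicitly drops the oldest), triggering a redraw:
private void Tick(XyDataSeries<double, double> fifoSeries)
{
    fifoSeries.Append(_nextX++, Rand.NextDouble());
}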

Test 4: Append Data Test

SciChart Performance Test 4: Append (NoiseFactor=0)

SciChart Performance Test 4: Append (NoiseFactor=100)

SciChart Performance Test 4: Append (NoiseFactor=1000)

N/3 points are appended to 3 DataSeries, then M points are appended between each draw of the chart. The data is a random walk, but we vary the noise to create more or less noisy waves. This has the effect of stressing drawing more (when noisier) vs. resampling more (when less noisy).

Areas stressed: Appending Data, Resampling, Auto-Ranging and Drawing.

NOTE: As the noise factor is increased, more pixels must be drawn, which stresses the rendering (drawing) more. For lower noise, resampling is the dominant influence on performance.
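
For illustration, a simple way to generate this kind of noise-adjustable random walk is sketched below (our own sketch, not the test’s actual generator): the larger the noiseFactor, the more pixels change between frames, and the more drawing dominates over resampling.

private static readonly Random Rand = new Random();
private double _trend;

// Illustrative noise-adjustable random walk for Test 4 (not the actual test code).
private double NextSample(double noiseFactor)
{
    _trend += Rand.NextDouble() - 0.5;                        // underlying random walk
    double noise = (Rand.NextDouble() - 0.5) * noiseFactor;   // adjustable noise term
    return _trend + noise;
}

// Between draws, append M new samples to each of the three data series, e.g.:
//   for (int i = 0; i < m; i++)
//       dataSeries.Append(nextX++, NextSample(noiseFactor));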

Test Setup Hardware

  • For the Test Setup we have used a desktop workstation with:
    • Intel i7-5820K 3.3GHz
    • 32GB DDR3-2400 RAM
    • 2GB NVidia GTX760 video card
    • Windows 8.1 Professional
  • Tests were run on a 16:10 1920×1200 monitor with the Test Suite application maximised.

For SciChart’s WPF Chart, the CPU and RAM speed are the biggest influencers on overall performance. For the DirectX Renderer, the GPU becomes more significant. Please note that we get decent numbers even on a dual-core laptop. You don’t need a powerful machine to get good results, but it helps (especially for DirectX).

// Test Results

We’ve included the test results below:

In Table Form

SciChart v4 DirectX vs HighSpeed (software) vs HighQuality (software) performance comparison showing FPS (Refresh Rate). High numbers are better!

In Chart Form

No WPF Charting Performance Test could be complete without some results in chart form! So, we’ve included the above results as a set of bar charts.




Site-Wide Search, powered by OneSearch™ now available
Posted by Andrew on 04 November 2014 10:40 AM

SciChart now features an awesome new site-wide search, powered by OneSearch™!

OneSearch is a joint venture between ABT Software and Shaw Thing Technology. Check it out!

OneSearch Multiple-Site Integrated Search Engine

OneSearch™ uses the open-source Elasticsearch engine and proprietary web crawlers. OneSearch™ is targeted as a private multi-site intranet or internet search engine for when you need high-quality search results across multiple sites, delivered by a single, unified search and suggestions API.

We are now able to crawl multiple documentation sources to provide one search engine. The front-end is powered by Ajax and provides autocomplete suggestions as the user types.

OneSearch - providing a single search API to multiple web sources

We have set it up on the SciChart website to crawl the following web-sources

Note you can also access the site-wide search via the lightweight front end at http://search.scichart.com.

Give it a try and let us know if it helped you find what you were looking for!

Best regards
[SciChart HQ]






Manifesto for Agile Software Development
Posted by Andrew on 27 August 2014 02:25 PM

I was reading a great code-rant on Agile Software Development, the sensationally titled Why Agile has failed! and stumbled upon the Agile Software Development manifesto. What is Agile? Well, the above article demonstrates what it is not. It’s not this:

Agile by Dilbert: http://dilbert.com/

Curiously, I had never read the Agile Software Development manifesto before, but here it is. It’s just 4 lines:

1. Individuals and interactions over processes and tools
2. Working software over comprehensive documentation
3. Customer collaboration over contract negotiation
4. Responding to change over following a plan

Are you doing this in your organisation? It’s certainly something we are striving to do! Sometimes successfully, sometimes unsuccessfully. Hopefully, we are getting better as time goes on.

How do we ‘do Agile’?

A few months ago we wrote an article on How we Handle Support & the Roadmap. In it we said:

Sometimes a feature has to be backed out at the last minute because quality is not high enough … We take care that every release is of high quality. We don’t want to ruin our hard-won reputation as the Best WPF Chart by releasing shoddy software … So, great care is taken that each feature works, is tested and documented, either by KB article, release note or example.

We prefer to document by example as it’s live: you can see it, play with it, we can test it each release, you can browse the source code, and it never gets out of date (you’ll never get a compilation error from out-of-date docs).

When I re-read this, it’s closer to the Agile Software Development Manifesto (and further from Dilbert-Agile) than I first realised!

 

How we implement … 

Individuals and Interactions

We promote direct interaction between customer and developers. In our teams we strive to communicate with each other, and share knowledge rapidly to best solve a problem.

Working Software over Comprehensive Documentation

We prefer to document by example, and iterate fast on features and improvements. By delivering fixes to you via a NuGet Nightly Build you can sometimes get the features and bug fixes you need within 24-hours! Guess what though, we don’t have a 250 page PDF user-manual, although we do have a living, growing KnowledgeBase.

Customer Collaboration

Did you know some of our customers submit fixes or improvements to the source which we include in the build? We work closely with customers to ensure they get what they want and we collaborate directly with you to create SciChart.

Responding to Change

Sometimes a feature has to be backed out, sometimes a deadline has to move. Sometimes it just can’t be done. Sometimes, it all needs to be dropped, and something else needs to be done instead! We are not rigid. You have to have a plan, sure, but you have to be ready to tear it up too.

 

So, hopefully, we are more agile than we first thought :) Sure, we have a few tools and processes too. We use JetBrains YouTrack, TeamCity and ProGet to automate continuous delivery, and we have Dave on tech support. What else do you need to create a successful software business?!





