SciChart – the Value of Priority Support!
Posted by Andrew on 17 March 2015 02:18 PM
In October 2014 we introduced a new support policy: SciChart WPF/SL Professional and Source-code customers receive priority support via tickets, while SciChart WPF Basic and trial users still receive support, but via the forums. This is part of an ongoing process of continually improving and refining the tech support we provide, as we feel it is such a critical part of our business.
This article shares some of what we've learned from this experience and demonstrates the value-add of priority support to our customers!
Why is Tech Support so Important?
Have you ever bought a component from a vendor that ships a great 200-page document on how to use it, but for some reason it just doesn't make sense to you? There's a learning curve to climb when purchasing a new component, and nothing is more frustrating than having a problem, and a deadline / angry boss / impatient customer <delete as applicable>, writing an email to their support and getting nothing back… Have you ever been there?
I personally have, and it's disappointing… Nothing leaves a worse taste in your mouth than a company that takes your money but doesn't respond when you need help.
We try to model ourselves on Telerik. Yes! I said a competitor's name! I know… not a good idea… but I believe we're non-overlapping competitors with Telerik, and personally, having used their docking components in 2008-2009 while working as a WPF developer, I realized they did something really well: tech support.
I noticed from using Telerik components that if you had a problem:
This is a model we have emulated and it has paid dividends to us. Most of our business comes from referrals! Enough said …
How Many Support Requests do we get?
Since introducing the support desk at http://support.scichart.com one year ago, we have resolved 1,360 support requests, 616 sales requests with an average feedback rating of 4.6/5.0!
Each ticket has an average of 5 replies before it is considered resolved, so in a year we've sent and received over 10,000 emails related to tech support. That's a pretty incredible rate, and the feedback score has been consistent over the year. We are proud of how much tech support we've handled and how we've delivered it, but more importantly, we're really pleased to see how much value tech support adds to our business. It's taught us a lot…
How Quickly do we Respond to Support Requests?
We advertise that we respond to support requests sub-day, i.e. within 24 hours to first response. This does not mean that we can resolve all problems in this timeframe, nor do we guarantee to respond within this time, but we aim to respond and at least make some progress towards a resolution within one business day. Over the past year, this is what our support desk says about our response time:
We’re quick, and we really care. We want to help you to use our software, because if you can use our software, you get great value out of it, and if you see value in it, you’re more likely to tell other people about us. In other words, our business model is centered around customer service.
What do our Users Say?
Every time a support request is resolved (marked as closed by staff or user) the customer has a chance to give us feedback. Over the past year we’ve collected over 300 feedback responses, with an average rating of 4.6/5. That’s a 90% customer happiness score!
Some of the comments are really encouraging. When customers give feedback scores and leave a comment, it is emailed directly to the team, CC the Company Director. Here is a selection of anonymised comments:
OH YEAH! Love you guys!
Similarly, we receive negative feedback via the ticket response ratings. If you want to encourage us, or give us a slap, the ticket survey is the place to do it. These surveys go straight to the team, CC the Company Director, so we do hear them!
What about the Trial Users / Basic Customers on the Forums?
We also have a public forum at www.scichart.com/questions. Here we aim to respond within 3 business days, though often sooner, depending on our workload. Conversations there are typically a lot shorter: a question and a single answer, or at most two or three answers. The good thing about the forums for us is that they are Google-indexed, so if you have a how-to question, this is the place to put it. It helps us to build a searchable knowledgebase!
We encourage you to use the ratings here, as voting questions up or down puts them higher or lower in the search results. We also love to see the public knowledgebase grow, as every question asked becomes a search result for someone later.
The Value-Add of SciChart Priority Tech Support
Our goal when introducing the support-policy in October 2014 was three-fold:
These goals have been achieved and more. In fact, we found that by introducing the support policy our support load has decreased significantly, customer satisfaction has gone up, and, importantly for us, sales are still strong. Excluding trial customers from support tickets hasn't put anyone off; far from it, we are seeing more and more referrals.
We hope this has been useful information to you, and if you're a competitor and you've read this far, we are available to consult on creating a state-of-the-art support model for your business. Just joking. We don't have time for that; we're focusing on our own customers and product development.
How Fast is SciChart’s WPF Chart? DirectX vs. Software Comparison
Posted by Andrew on 30 December 2014 07:25 PM
Test results demonstrate that SciChart's Direct3D10RenderSurface, available in the SciChart Enterprise and SDK Editions, is up to 18x faster than software rendering, with a median speed increase of 4x. Across a variety of tests and chart types, SciChart's DirectX renderer plugin excelled in performance vs. software!
Measuring WPF Chart Performance
WPF Chart Performance is really important to us at SciChart. We strive to have the best, and the fastest WPF Chart components in the world, but how do we demonstrate what we’ve achieved vs. our competitors to would-be customers?
We’ve created an application which measures our WPF Chart performance in a number of scenarios, such as line, scatter, scrolling-line and many-series scenarios. The tests are designed to really stress the chart control and find its limits.
There are several factors which influence the overall performance of a WPF chart control, such as:
These are reflected in the test cases and show stress on different areas of the renderer.
The Test Setup
Test Setup Configuration
As you may know, SciChart ships with several RenderSurface plugins including the HighSpeedRenderSurface, HighQualityRenderSurface, and now, the Direct3D10RenderSurface, a DirectX10/11 hardware-accelerated RenderSurface implementation.
We’ve tested the latest release of SciChart: v4, and compared the three renderer plugins across a number of different scenarios in order to highlight the difference in performance of the SciChart RenderSurface implementations.
What Tests are Run?
A number of tests are run in our WPF Chart Performance Comparison app which stress different areas of the renderer. These tests are designed to really stress the chart, by having huge numbers of series, or points, or many updates per second.
Each test is run, and the FPS (refresh rate) is measured via the SciChartSurface.Rendered event, which fires once after each drawing pass completes.
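The measurement itself is simple: record a timestamp each time a render event fires, and the reciprocal of the elapsed time between consecutive events is the frame rate. Here is a minimal Python sketch of that idea (not SciChart code; the `FpsMeter` class and its names are our own for illustration):

```python
import time

class FpsMeter:
    """Computes frames-per-second from the elapsed time between render events."""

    def __init__(self):
        self._last = None
        self.fps = 0.0

    def on_rendered(self, now=None):
        # Call once per completed drawing pass (analogous to a Rendered event).
        now = time.perf_counter() if now is None else now
        if self._last is not None:
            elapsed = now - self._last
            if elapsed > 0:
                self.fps = 1.0 / elapsed
        self._last = now

meter = FpsMeter()
meter.on_rendered(now=0.0)
meter.on_rendered(now=0.02)   # 20 ms between frames -> 50 FPS
```

In practice you would average this over many frames (or over the 20-second test window) to smooth out per-frame jitter.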
The test cases are as follows:
Test 1: NxM Series Test
N series of M points are appended to an XyDataSeries, then the chart redraws (the same data) as fast as possible for 20 seconds per test. FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.
Areas Stressed: Iterating Data Series, Coordinate Transformation and Drawing.
NOTE: Resampling is not applicable for short data-series (below a few thousand points) so this tests iterating over many data-series, coordinate transformation as well as raw drawing speed.
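To make the resampling idea mentioned above concrete, here is a hedged sketch of one common approach, min/max reduction per bucket: keep only the minimum and maximum sample for each horizontal pixel column, so the drawn shape is preserved while the point count is bounded by the viewport width. This is a generic illustration, not SciChart's actual algorithm:

```python
def minmax_resample(ys, width):
    """Reduce ys to at most 2*width points: min and max per pixel bucket."""
    n = len(ys)
    if n <= 2 * width:
        return list(ys)          # short series: resampling not worthwhile
    out = []
    bucket = n / width
    for i in range(width):
        lo, hi = int(i * bucket), int((i + 1) * bucket)
        chunk = ys[lo:hi]
        out.append(min(chunk))   # preserves the lowest pixel in this column
        out.append(max(chunk))   # preserves the highest pixel in this column
    return out

# 10,000 points reduced to 200 for a 100-pixel-wide viewport
reduced = minmax_resample(list(range(10000)), width=100)
```

Note how the short-series early-exit mirrors the NOTE above: below a few thousand points the reduction buys nothing, so iteration and drawing dominate.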
Test 2: Scatter Series Test
N scatter points are appended to an XyDataSeries, then the chart redraws. Immediately after, the points are updated in a Brownian-motion fashion and the chart is drawn again. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.
Areas Stressed: Coordinate Transformation, Geometry Generation (Ellipse) and Drawing.
NOTE: Resampling is not applicable for Scatter Series as our standard resampling algorithms do not work with scatter data, so this test effectively tests the geometry generation and drawing speed.
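The "Brownian motion" update is just a small random step applied to every point between draws, which forces a full coordinate transform and re-draw each frame. A minimal sketch (illustrative only; the step size is an assumption, not the test app's actual value):

```python
import random

def brownian_step(xs, ys, step=0.01, rng=random):
    """Move every point by a small random amount in x and y."""
    for i in range(len(xs)):
        xs[i] += rng.uniform(-step, step)
        ys[i] += rng.uniform(-step, step)

xs = [0.0] * 1000
ys = [0.0] * 1000
for _ in range(60):          # simulate 60 update/draw cycles
    brownian_step(xs, ys)
```

Because every point moves every frame, nothing can be cached between draws, which is exactly what makes this a good stress test for geometry generation.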
Test 3: FIFO Series Test
N points are appended to a FIFO (circular buffer) series, then a single point is appended (and one dropped), which triggers a redraw of the chart. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.
Areas Stressed: Copying Circular Buffers (FIFO Series), Resampling and Drawing.
NOTE: Because the data is random (0-1), lots of pixels are drawn. This test places a heavier emphasis on drawing than on resampling, although at higher point counts resampling starts to kick in.
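The FIFO behaviour described above, where appending one point beyond capacity drops the oldest, is what produces the scrolling effect. A minimal Python sketch of such a series using a bounded deque (our own illustrative class, not the SciChart API):

```python
from collections import deque

class FifoSeries:
    """A FIFO (circular-buffer) series: appends beyond capacity drop the oldest point."""

    def __init__(self, capacity):
        self.x = deque(maxlen=capacity)
        self.y = deque(maxlen=capacity)

    def append(self, x, y):
        # When full, deque(maxlen=...) silently discards the oldest entry.
        self.x.append(x)
        self.y.append(y)

s = FifoSeries(capacity=1000)
for i in range(1500):
    s.append(i, i * 0.5)
# The buffer now holds only the last 1000 points (x = 500..1499).
```

The cost being measured in Test 3 is partly this buffer management: each redraw must walk (or copy) the circular buffer in order, on top of resampling and drawing.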
Test 4: Append Data Test
N/3 points are appended to each of 3 DataSeries, then M points are appended between each draw of the chart. The data is a random walk, but we vary the noise to create more or less noisy waves. This has the effect of stressing drawing more (when more noisy) vs. resampling more (when less noisy).
Areas stressed: Appending Data, Resampling, Auto-Ranging and Drawing.
NOTE: As the noise factor is increased, more pixels must be drawn, which stresses the rendering (drawing) more. For lower noise, resampling is the dominant influence on performance.
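A hedged sketch of what such a generator might look like: a random walk with an additional per-sample noise term whose amplitude is set by a noise factor (the exact ranges here are our assumptions, not the test app's values):

```python
import random

def random_walk(n, noise=1.0, seed=42):
    """Random-walk data with an adjustable per-sample noise amplitude."""
    rng = random.Random(seed)
    ys, y = [], 0.0
    for _ in range(n):
        y += rng.uniform(-0.5, 0.5)                # the underlying walk
        ys.append(y + rng.uniform(-noise, noise))  # noise layered on top
    return ys

smooth = random_walk(10_000, noise=0.1)   # resampling-dominated workload
noisy = random_walk(10_000, noise=5.0)    # drawing-dominated workload
```

With low noise, adjacent samples map to nearly the same pixels and resampling collapses them cheaply; with high noise, nearly every sample lands on a different pixel and raw drawing speed dominates.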
Test Setup Hardware
For SciChart’s WPF chart, the CPU and RAM speed are the biggest influences on overall performance. For the DirectX renderer, the GPU becomes more significant. Please note that we get decent numbers on a dual-core laptop: you don’t need a powerful machine to get good results, but it helps (especially for DirectX).
Test Results
We’ve included the test results below:
In Table Form
In Chart Form
No WPF charting performance test would be complete without some results in chart form! So, we’ve included the above results as a set of bar charts.
Customer Case Study – BlueShift ONE System
Posted by Andrew on 17 July 2014 03:15 PM
Recently, one of our earliest customers posted a case study about SciChart on their blog. We are re-posting it here as a case study with permission from BlueShift. Thanks BlueShift! We love working with you! – SciChart Team
Customer Case Study – BlueShift ONE
From the developers: “BlueShift’s ONE System brings together account management, demand planning and financial control to deliver a unified forecast from volume down to gross margin. Their software provides a “one number” approach to promotional plans management and demand forecasting, and it is in this respect that SciChart plays such an important role to provide transparency and flexibility in performing these roles.”
A great charting tool is a necessity to allow us to provide users of the ONE solution with information in a fast and efficient manner. We spent quite a while in analysis, pushing the best third-party charting packages to their limits before deciding to implement the SciChart tool.
SciChart is an extremely fast and flexible solution for building charts of many different varieties, and it has allowed us to construct graphs that users can interact with and customise on the fly in a number of ways. With the ability to render millions of data points in an endless array of measures, we can show live data from our database to users nearly instantly.
In the image above we can identify a chart containing various measures, a legend that has been customised to show aggregated data over a selection made by the user, and a Time Navigator underneath to give the user context on what time range they are currently seeing. We have taken advantage of SciChart’s “ChartModifier” extensions to write our own modifications that specify exactly how the chart behaves when interacted with by the mouse or keyboard. The result is a chart that the user can very accurately pan and zoom, make selections over (for editing or analysis), and other proprietary/secret BlueShift stuff.
Our charts let users specify at run-time which measures are shown and in what order. Options for the style of each measure are also available, ranging from whether it is a line, column or area type, to what colour it is, how thick it is and whether the line is dashed. The chart above also supports two Y-axes, and allows aggregation to different time levels.
In this image we see a very different chart that shows data points that have been customised to display a different type of circle based on a summary of the data under that period. To get this result, we had to blend a set of tools, including the ScatterChart renderable series, hard-coded Y-axes ranges and a suite of optimised images to represent each type of summary circle.
As far as visually representing information in meaningful ways, we are very pleased with how seamlessly our charts integrate with the rest of the ONE system.
Would you like a case study linking back to your website? Contact us, as we would love to hear from you!