SciChart – the Value of Priority Support!
Posted by Andrew on 17 March 2015 02:18 PM

Recently, in October 2014, we introduced a new support policy, where we decided to give priority support tickets to SciChart WPF/SL Professional and Source-code customers. SciChart WPF Basic and trial users would still receive support, but via the forums. This is part of an ongoing process of continually improving and refining the tech support we provide, as we feel it is such a critical part of our business.

This article shares some of what we’ve learned from this experience, and demonstrates the value-add of priority support to our customers!

Why is Tech Support so Important?

Have you ever bought a component from a vendor, and they might have a great 200 page document on how to use it, but for some reason it just doesn’t make sense to you? There’s a learning curve to climb when purchasing a new component and nothing is more frustrating than having a problem, and a deadline / angry boss / impatient customer <delete as applicable>, writing an email to their support and getting nothing back … Have you ever been there?

I personally have, and it’s disappointing… There is nothing that leaves a bad taste in your mouth more than a company that takes your money, but doesn’t respond when you need help.

We try to model ourselves on Telerik. Yes! I said a competitor’s name! I know … not a good idea … but I believe we’re non-overlapping competitors with Telerik, and personally, having used their docking components in 2008-2009 while working as a WPF developer, I realized they did something really well: tech support.

I noticed from using Telerik components that if you had a problem:

  • You posted it on the forums
  • You got an answer within a day or two from somebody competent
  • Usually you had to provide a solution to reproduce … which was annoying but …
  • … when they had information to reproduce the problem, they fixed it, or offered a workaround pretty quickly.

In other words, I could get my job done and continue on with my day. That is a value-add.

This is a model we have emulated and it has paid dividends to us. Most of our business comes from referrals! Enough said …

How Many Support Requests do we get?

Since introducing the support desk at http://support.scichart.com one year ago, we have resolved 1,360 support requests, 616 sales requests with an average feedback rating of 4.6/5.0!

SciChart Support – Total Ticket Count by Department

SciChart Support – Average Replies before Ticket Resolved

Each ticket has an average of 5 replies before it is considered resolved, so in a year we’ve sent and received over 10,000 emails related to tech support. That’s a pretty incredible rate, and the feedback score is consistent over the year. We are proud of how much tech support we’ve handled and how we’ve delivered it, but more importantly, we’re really pleased to see how much value tech support adds to our business. It’s taught us a lot …

How Quickly do we Respond to Support Requests?

We’re quick, and we really care.

We advertise that we respond to support requests sub-day, i.e. within 24 hours to first response. This does not mean that we can resolve all problems in this timeframe, nor do we guarantee to respond within this time, but we aim to respond and at least make some progress towards resolution within one business day. Over the past year, this is what our support desk says about our response times:

SciChart Support – Average Time to First Response

SciChart Support – Maximum Time to First Response

  • Average time to first response (per staff member, per month) was always less than one day
  • In some cases the average response time was less than one hour!
  • Maximum response time was typically 3 days. Don’t forget we don’t offer support on weekends (business days only), but this report includes Monday-Sunday
  • Our fastest response-and-resolution was 4 minutes

We want to help you to use our software, because if you can use our software, you get great value out of it, and if you see value in it, you’re more likely to tell other people about us. In other words, our business model is centered around customer service.

What do our Users Say?

Every time a support request is resolved (marked as closed by staff or user) the customer has a chance to give us feedback. Over the past year we’ve collected over 300 feedback responses, with an average rating of 4.6/5. That’s a 92% customer happiness score!

SciChart Support – Average Feedback Rating by Staff Member

Some of the comments are really encouraging. When customers give feedback scores and leave a comment, it is emailed directly to the team, CC the Company Director. Here is a selection of anonymised comments:

“The ticket was really fast answered. All open items could be closed with customer satisfaction. Thank you again!” (Thomas, 5/5, Feb 2015)

“SciChart always provides great feedback and service – A+” (Phillip, 5/5, Jan 2015)

“You guys rock… I’ve only just loaded up the source but from what I can see it’s very clean and well organized. I’ll be spending time this weekend working on the modified data series. I’ll report back as I progress… thanks again, you guys are really great to deal with and sciCharts really rocks!” (Geoff, 5/5, Jan 2015)

“Thanks for prompt response and clear explanation. From your advice, I also found that there are useful resources for using sciChart in support page.” (Jack, 5/5, Jan 2015)

“SciChart has absolutely excellent customer support. Every time I have contacted you, my questions have been answered and issues resolved quickly and professionally. This time was no exception. No need for any improvement.” (William, 5/5, Dec 2014)

“Bug reports made for problems found. Acceptable workarounds provided. Super support” (Bob, 5/5 Dec 2014)

“Pleased with response. Problem confirmed and fixed within 24 hours” (Ian, 5/5, Dec 2014)

“I am very impressed. It appears to be what we need for our application. When zooming in we would prefer exponents with floating point, this appears to be the behavior you implemented. Other customers may prefer linear behavior, it would be nice to give them the option.” (Vincent, 5/5, Sept 2014)

“The response was so fast I would like to give you 6 stars” (Mark, 5/5, Nov 2014)

OH YEAH! Love you guys!

We also receive negative feedback via the ticket response ratings. If you want to encourage us, or give us a slap, the ticket survey response is the place to do it. These surveys go straight to the team, CC the company director, so we do hear them!

What about the Trial Users / Basic Customers on the Forums?

We also have a public forum at www.scichart.com/questions. Here we aim to respond within 3 business days, but often sooner, depending on our workload. The conversation time is typically a lot shorter: it tends to be a question and a single answer, or at most two or three answers. The good thing about the forums, for us, is that they are Google-indexed, so if you have a how-to question this is the place to put it. It helps us to build a searchable knowledgebase!

SciChart Support – Public Searchable Forums

SciChart Support – Forums are Searchable on Google

We encourage you to use the ratings here, as voting questions up/down puts them higher or lower in the search results. Also, we love to see the public knowledgebase grow, as every question asked becomes a search result for someone later.

The Value-Add of SciChart Priority Tech Support

In conclusion, our goal when introducing the support policy in October 2014 was threefold:

  • To reduce our support load (by limiting priority support to paid customers with an active subscription)
  • To allow us to focus on providing excellent service
  • To increase the value of our higher-end products

These goals have been achieved and more. In fact, we found that by introducing the support policy our support load has decreased significantly, customer satisfaction has gone up, and, importantly for us, sales are still strong. Excluding trial customers from support tickets hasn’t put anyone off; far from it, we are seeing more and more referrals.

We hope this has been useful information, and if you’re a competitor and you’ve read this far, we are available to consult on creating a state-of-the-art support model for your business. Just joking. We don’t have time for that; we’re focusing on our own customers and product development.




How Fast is SciChart’s WPF Chart? DirectX vs. Software Comparison
Posted by Andrew on 30 December 2014 07:25 PM

// TL;DR

Test results demonstrate that SciChart’s Direct3D10RenderSurface, available in the SciChart Enterprise and SDK Editions, is up to 18x faster than software rendering, with a median speed increase of 4x. Across a variety of test results and chart types, SciChart’s DirectX renderer plugin excelled in performance vs. software!

// Measuring WPF Chart Performance

WPF chart performance is really important to us at SciChart. We strive to have the best and fastest WPF chart components in the world, but how do we demonstrate what we’ve achieved vs. our competitors to would-be customers?

We’ve created an application which measures our WPF chart performance in a number of scenarios, such as line, scatter, scrolling-line and many-series scenarios. The tests are designed to really stress the chart control and find its limits.

There are several factors which influence the overall performance of a WPF chart control, such as:

  • Number of Series in the chart
  • Types of Series (e.g. Line, Scatter, Area, Candlestick)
  • Number of data-points per series
  • Rate of change of data (number of points appended or removed per second)
  • Thickness of pens / number of pixels filled
  • Size of the chart viewport on screen
  • Number of transformations between data & pixel coordinates per second
  • Any additional calculations like Logarithmic Axis scaling or Auto Ranging

These factors are reflected in the test cases, each of which stresses different areas of the renderer.

// The Test Setup

Test Setup Configuration

As you may know, SciChart ships with several RenderSurface plugins, including the HighSpeedRenderSurface, the HighQualityRenderSurface and now the Direct3D10RenderSurface, a DirectX10/11 hardware-accelerated RenderSurface implementation.

We’ve tested the latest release of SciChart: v4, and compared the three renderer plugins across a number of different scenarios in order to highlight the difference in performance of the SciChart RenderSurface implementations.

What Tests are Run

A number of tests are run in our WPF Chart Performance Comparison app, each stressing a different area of the renderer. The tests are designed to really push the chart, by using huge numbers of series, huge numbers of points, or many updates per second.

For each test, FPS (refresh rate) is measured via the SciChartSurface.Rendered event, which fires once after each drawing pass completes.
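The FPS calculation itself is simple: take the elapsed time between subsequent render events and invert it. Here is a minimal, language-neutral sketch in Python (illustrative only; the actual test app is a WPF/C# application and its code is not published):

```python
import statistics

def fps_from_timestamps(render_times):
    """Derive FPS figures from a list of render-event timestamps (seconds).

    Mirrors the approach described above: FPS is computed from the elapsed
    time between subsequent render events.
    """
    # Elapsed time between each pair of consecutive render events
    deltas = [t2 - t1 for t1, t2 in zip(render_times, render_times[1:])]
    # Instantaneous frame rate is the reciprocal of each inter-frame delta
    frame_rates = [1.0 / d for d in deltas if d > 0]
    return statistics.mean(frame_rates), statistics.median(frame_rates)

# Example: a render event every 20 ms corresponds to 50 FPS
times = [i * 0.020 for i in range(11)]
mean_fps, median_fps = fps_from_timestamps(times)
```

Reporting both mean and median is useful because a single stalled frame can skew the mean while leaving the median untouched.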

The test cases are as follows:

Test 1: NxM Series Test

SciChart Performance Test 1: NxM Series

N series of M points are appended to an XyDataSeries, then the chart redraws (the same data) as fast as possible for 20 seconds per test. FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Iterating Data Series, Coordinate Transformation and Drawing.

NOTE: Resampling is not applicable for short data-series (below a few thousand points) so this tests iterating over many data-series, coordinate transformation as well as raw drawing speed.
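To illustrate what resampling does once a series is long enough for it to apply: SciChart’s actual resampling algorithms are internal to the library, but a common general technique is min/max bucketing, which keeps the visual extremes of each pixel-column’s worth of data. A sketch of that generic approach:

```python
def minmax_resample(ys, target_buckets):
    """Reduce a long series to ~2 points per bucket (min and max).

    A common resampling technique for line charts; this is NOT SciChart's
    algorithm, just an illustrative sketch of the general idea.
    """
    n = len(ys)
    if n <= 2 * target_buckets:
        return list(ys)  # short series: no resampling needed
    out = []
    bucket = n / target_buckets
    for i in range(target_buckets):
        seg = ys[int(i * bucket):int((i + 1) * bucket)]
        out.append(min(seg))  # keep the low extreme of the bucket
        out.append(max(seg))  # keep the high extreme of the bucket
    return out

# 10,000 points reduced to 1,000 while preserving visible extremes
resampled = minmax_resample(list(range(10000)), 500)
```

Because the on-screen extremes are preserved, the resampled line is visually indistinguishable from the full series at that zoom level, while drawing far fewer points.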

Test 2: Scatter Series Test

SciChart Performance Test 2: Scatter Series

N scatter points are appended to an XyDataSeries, then the chart redraws. Immediately after, the points are updated in a Brownian-motion fashion and the chart is drawn again. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Coordinate Transformation, Geometry Generation (Ellipse) and Drawing.

NOTE: Resampling is not applicable for Scatter Series as our standard resampling algorithms do not work with scatter data, so this test effectively tests the geometry generation and drawing speed.

Test 3: FIFO Series Test

SciChart Performance Test 3: FIFO Series

N points are appended to a FIFO (circular buffer) series, then a single point is appended (and one dropped), which triggers a redraw of the chart. The FPS is measured using the time elapsed between subsequent SciChartSurface.Rendered events.

Areas Stressed: Copying Circular Buffers (FIFO Series), Resampling and Drawing.

NOTE: Because the data is random (0-1), lots of pixels are drawn. This test places a heavier emphasis on drawing than on resampling, although at higher point counts resampling starts to kick in.
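The FIFO behaviour described above (append one point, drop the oldest) is the classic fixed-capacity circular buffer. A minimal sketch using Python’s standard library (illustrative only; SciChart’s FIFO series internals are its own):

```python
from collections import deque

# A FIFO (circular buffer) series: once at capacity, each append drops
# the oldest point, which is exactly the behaviour the FIFO test exercises.
capacity = 5
fifo = deque(maxlen=capacity)

for x in range(5):
    fifo.append(x)   # fill to capacity: 0, 1, 2, 3, 4

fifo.append(5)       # one point appended, the oldest (0) is dropped
```

The appeal for live charts is that memory stays bounded no matter how long the feed runs, and each append is O(1).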

Test 4: Append Data Test

SciChart Performance Test 4: Append (NoiseFactor=0)

SciChart Performance Test 4: Append (NoiseFactor=100)

SciChart Performance Test 4: Append (NoiseFactor=1000)

N/3 points are appended to 3 DataSeries, then M points are appended between each draw of the chart. The data is a random walk, but we vary the noise factor to create more or less noisy waves. This has the effect of stressing drawing more (when the data is noisier) vs. resampling more (when it is smoother).

Areas stressed: Appending Data, Resampling, Auto-Ranging and Drawing.

NOTE: As the noise factor is increased, more pixels must be drawn, which stresses the rendering (drawing) more. For lower noise, resampling is the dominant influence on performance.
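To make the noise-factor idea concrete, here is a sketch of a random-walk generator with tunable noise. The exact generator used by the test app is not published, so function and parameter names here are hypothetical:

```python
import random

def random_walk(n, noise_factor):
    """Generate a random-walk series whose local jitter scales with
    noise_factor. Noisier data forces more pixels to be drawn; smoother
    data shifts the load towards resampling. (Illustrative sketch only.)
    """
    rng = random.Random(42)  # fixed seed for reproducibility
    ys, y = [], 0.0
    for _ in range(n):
        y += rng.gauss(0.0, 1.0)                          # underlying smooth walk
        ys.append(y + rng.uniform(-1.0, 1.0) * noise_factor)  # added jitter
    return ys

smooth = random_walk(1000, 0)    # NoiseFactor=0: pure walk
noisy = random_walk(1000, 100)   # NoiseFactor=100: jittery walk
```

With the same seed, both series share the same underlying walk; only the jitter differs, which isolates the effect of the noise factor on rendering load.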

Test Setup Hardware

  • For the Test Setup we have used a desktop workstation with
    • Intel i7-5820K 3.3GHz
    • 32GB DDR3-2400 RAM
    • 2GB NVidia GTX760 video card
    • Windows 8.1 Professional
  • Tests were run on a 16:10 1900×1200 monitor with the Test Suite application maximised.

For SciChart’s WPF chart, the CPU and RAM speed are the biggest influences on overall performance; for the DirectX renderer, the GPU becomes more significant. Please note that we get decent numbers even on a dual-core laptop. You don’t need a powerful machine to get good results, but it helps (especially for DirectX).

// Test Results

We’ve included the test results below:

In Table Form

SciChart v4 DirectX vs. HighSpeed (software) vs. HighQuality (software) performance comparison, showing FPS (refresh rate). Higher numbers are better!

In Chart Form

No WPF charting performance test could be complete without some results in chart form! So, we’ve included the above results as a set of bar charts.




Customer Case Study – BlueShift ONE System
Posted by Andrew on 17 July 2014 03:15 PM

Recently, one of our earliest customers posted a case study about SciChart on their blog. We are re-posting it here, with permission from BlueShift. Thanks BlueShift! We love working with you! – SciChart Team

Customer Case Study – BlueShift ONE

From the developers: “BlueShift’s ONE System brings together account management, demand planning and financial control to deliver a unified forecast from volume down to gross margin. Their software provides a ‘one number’ approach to promotional plans management and demand forecasting, and it is in this respect that SciChart plays such an important role to provide transparency and flexibility in performing these roles.”

A great charting tool is a necessity to allow us to provide users of the ONE solution with information in a fast and efficient manner. We spent quite a while in analysis, pushing the best third-party charting software to their limits before deciding on implementing the SciChart tool.

SciChart is an extremely fast and flexible solution for building charts of many different varieties, and it has allowed us to construct graphs that users can interact with and customise on the fly in a number of ways. With the ability to render millions of data points in an endless array of measures, we can show live data from our database to users nearly instantly.

BlueShift-One_1

In the image above we can identify a chart containing various measures, a legend that has been customised to show aggregated data over a selection made by the user, and a Time Navigator underneath to give the user context on what time range they are currently seeing. We have taken advantage of SciChart’s “ChartModifier” extensions to write our own modifications that specify exactly how the chart behaves when interacted with by the mouse or keyboard. The result is a chart that the user can very accurately pan and zoom, make selections over (for editing or analysis), and other proprietary/secret BlueShift stuff.

Our charts support user run-time specification of which measures are shown and in what order. Options around the style of each measure are also available, ranging from whether it is a line, column or area type, to its colour, its thickness and whether the line is dashed. The chart above also supports two Y-axes, and it allows aggregation to different time levels.

BlueShift-One_2

In this image we see a very different chart that shows data points that have been customised to display a different type of circle based on a summary of the data under that period. To get this result, we had to blend a set of tools, including the ScatterChart renderable series, hard-coded Y-axes ranges and a suite of optimised images to represent each type of summary circle.

As far as visually representing information in meaningful ways, we are very pleased with how seamlessly our charts integrate with the rest of the ONE system.

 

Would you like a case study linking back to your website? Contact us, as we would love to hear from you!
[SciChart Team]






CONTACT US

Not sure where to start? Contact us, we are happy to help!


SciChart Ltd, 16 Beaufort Court, Admirals Way, Docklands, London, E14 9XL. Company Number: 07430048, VAT Number: 101957725