Does Schneider recommend a specific tool for measuring the 2D Graphics PassMark score when spec'ing hardware for Virtual ViewX? We've been using the tools from https://www.passmark.com.au/, but have found that the scores vary wildly between versions: v10 gives a score of almost half that of v9. Their help does say this is expected due to changes in the tests, but that makes it pretty useless as a benchmark, and meaningless in terms of spec'ing hardware for Virtual ViewX...
Is there a specific tool and version that Schneider recommend? What tool and version was used when preparing the requirements for Virtual ViewX servers in the GeoSCADA Installation doco? It would be helpful to know so we can use the same to be consistent.
Does anyone have any recommendations for PassMark tools that give consistent results, please? Thanks 🙂
I guess that's a question for Steve on what version they used. It's common for new versions of benchmarks to knock the numbers down; otherwise, after a while, the scores for newer hardware would grow so large as to be next to useless.
For home use I use UserBenchMark. It's portable, and while people on the Internet aren't fans of it (shock: it's the Internet, and they don't like the way it calculates the total scores), for selective one-to-one comparisons I find it works well.
I'm trying to track this down. However, the PassMark scoring methods don't seem to be as helpful as they could be.
Given more recent performance tests (see https://community.exchange.se.com/t5/Geo-SCADA-Expert-Forum/Looking-for-anyone-who-has-deployed-Virt...), the following *may* be more useful. Note this is not official SE guidance.
The Geo SCADA Virtual ViewX Server (VVS) uses a screen virtualization technology to render ViewX content on the web browser. Each new Virtual ViewX user is given a private ViewX instance, logged in to their Geo SCADA user account. As a consequence, the web server must have enough resources to run these Virtual ViewX instances for the number of users it is licensed for, and it may be necessary to distribute Virtual ViewX users among servers differently from WebX.
We recommend that, in addition to operating system memory requirements (approximately 3 GB), you allocate 200 MB of RAM per client, or more for complex mimic displays.
We recommend one core per two ViewX clients, with a minimum of four cores for a server.
We recommend a minimum network connection for the server of 100 Mbps, with 1 Gbps preferred for servers hosting over 15 clients.
An SSD drive for the operating system, Program Files and Log file locations is recommended.
A modern (2018 or later) CPU of 2.5 GHz or faster is recommended; greater CPU or graphics speed will give a better user experience.
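To make the rules of thumb above concrete, here is a minimal sizing sketch. It simply encodes the figures from this thread (3 GB for the OS, 200 MB per client, one core per two clients with a minimum of four, 1 Gbps networking above 15 clients); it is not official SE guidance, and the function name and defaults are my own.

```python
import math

def vvs_server_spec(clients, os_ram_gb=3, ram_per_client_mb=200):
    """Rough Virtual ViewX server sizing from the rules of thumb above.

    Assumptions (not official SE guidance): ~3 GB RAM for the OS,
    200 MB RAM per client, one core per two clients (minimum four),
    and a 1 Gbps network link for servers hosting over 15 clients.
    """
    ram_gb = os_ram_gb + clients * ram_per_client_mb / 1024
    cores = max(4, math.ceil(clients / 2))
    network = "1 Gbps" if clients > 15 else "100 Mbps"
    return {"ram_gb": round(ram_gb, 1), "cores": cores, "network": network}

print(vvs_server_spec(20))
```

For example, a 20-client server works out to roughly 7 GB of RAM, 10 cores and a 1 Gbps link under these assumptions, before any allowance for complex mimics.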
Your comments around this would be appreciated.
@sbeadle , thanks for this extra information. It's useful to have more background on where the recommended specs come from, but applying the guidelines in your comments gives me pretty much the same answer for required server specs as following the simple table in the GeoSCADA installation doco. It doesn't get me any further in terms of spec'ing graphics requirements for the host... We'll be doing some testing here to ascertain whether the standard host can run 20 clients, but we cannot currently advise our client on what hardware they need to budget for the Virtual ViewX host, and that's our main issue at the moment. It sounds like it's basically a 'suck it and see' situation, but it would be preferable to be able to give our customer some more solid advice on this 🙂
Thanks Howard. Solid advice would have to work in all customer situations and configurations, and this would result in over-specification. Note that the text above does not specify any graphics hardware in the server. We also think that the number of cores can be proportionately reduced as the number of clients increases, maybe 3 clients per core above 10 clients. We will refine this further for documentation.
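If the suggested refinement above were adopted, the core count could be sketched as follows. This is just one reading of "3 clients per core above 10 clients" (one core per two clients up to 10, then one per three beyond, minimum four cores); it is a suggestion from the thread, not documented guidance.

```python
import math

def cores_needed(clients):
    # Sketch of the refinement suggested in the thread (not official
    # guidance): one core per two clients up to 10 clients, then one
    # core per three clients beyond that, with a minimum of four cores.
    if clients <= 10:
        cores = math.ceil(clients / 2)
    else:
        cores = 5 + math.ceil((clients - 10) / 3)
    return max(4, cores)
```

Under this reading, 20 clients would need 9 cores rather than the 10 implied by a flat one-core-per-two-clients rule.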
Hi @sbeadle , I can see the above does not mention graphics requirements, but that's the main unknown for us and the customer. Providing all the other requirements such as cores/memory isn't really a concern; the graphics requirements are the big unknown where we and the customer do need some guidance to help spec/budget appropriate hardware. We do need to think about how to offer some guidance. If we were further down the track there would be more anecdotal evidence to base new deployments on, but unfortunately it seems that this isn't the case at the moment...