Hi,
I'm working on a script that requires high speed. I have one script running in a virtual machine and another in Windows 10. Both run at around 20 loops per second, and quite a lot of activity is handled in each loop. Communication between the two environments goes via a shared folder.
I have log files like the one below. The numbers are milliseconds, so I keep track of how long each subroutine takes.
In general I can get the speed that I need. But sometimes an instruction takes hundreds of milliseconds instead of a few: writing the log file, for instance. Other things that in all other cases need only milliseconds also suddenly take longer, and almost always about 300 ms extra. So 5 ms normally, 300 ms with a hiccup.
It looks like Windows decides every now and then that no time slot is available for my script. CPU activity is 60–70% when all my stuff is running. When an instruction takes longer than expected, it is usually around 300 ms extra.
I try, of course, to limit the demands my script puts on the CPU. But is there another way to prevent a sudden 300 ms hiccup? The script really stops for 300 ms, and that's a problem if you want to loop 20 times per second.
I would rather have that 300 ms spread out than all at one moment. But how to achieve that?
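For what it's worth, one general way to keep a fast loop from stalling on slow writes is to hand them to a background thread through a queue, so the loop itself never blocks on the disk. A minimal Python sketch of the idea (illustrative only; the actual script here is Macro Scheduler, which would need its own equivalent):

```python
import queue
import threading

log_queue = queue.Queue()

def log_writer(path):
    """Drain the queue and append lines to the log file.

    The main loop only does a cheap queue.put(); the (occasionally
    slow) disk write happens on this background thread instead.
    """
    with open(path, "a") as f:
        while True:
            line = log_queue.get()
            if line is None:          # sentinel: shut down
                break
            f.write(line + "\n")
            f.flush()

writer = threading.Thread(target=log_writer, args=("loop.log",), daemon=True)
writer.start()

# Inside the 20 Hz loop: this returns immediately even if the disk is busy.
log_queue.put("73450 ReadEngineOut: CandidateThinkingTime: 10")

log_queue.put(None)   # tell the writer to finish
writer.join()
```

This doesn't remove the 300 ms cost, but it moves it off the timing-critical loop, which is one way of "spreading it out".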
Kind regards,
Remco
Log example with everything going okay:
73450 ReadEngineOut: String:
73450 ReadEngineOut: CandidateThinkingTime: 10
73450 ReadEngineOut: First output in: 188ms (mean: 196ms min/max: 188ms/203ms)
73450 ReadEngineOut: TimeDiffLoc: 188 TimeDiffSum: 391 TimeDiffCounter: 2 TimeDiffAverage 196
73450 ReadEngineOut: TimeDiffLocMIN: 188 TimeDiffLocMAX: 203
73481 MakeMoveForMe: Locked: 0 HelpMe: 0 RealOutput: 1 Help: 1 JustAnswered: 0
73495 GetMove: Before CheckForEmptyLine
73504 GetMove: GetTextQuick: 1
73505 Read: BeforeGetTextAtPoint
73506 Read: NewResultTemp: 2 ItemsTemp: 2 ReadAttempts: 1
73506 Read: x: 827 y: 444
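As an aside, a log in this format lends itself to automatic hiccup detection: the first field of each line is a millisecond timestamp, so the gaps between consecutive lines can be computed and flagged. A rough Python sketch (the 100 ms threshold is an arbitrary choice):

```python
def find_hiccups(log_lines, threshold_ms=100):
    """Return (line, gap_ms) pairs where the time since the previous
    log line exceeds threshold_ms. Assumes each line starts with a
    millisecond timestamp, as in the log above."""
    hiccups = []
    prev = None
    for line in log_lines:
        ts = int(line.split()[0])
        if prev is not None and ts - prev > threshold_ms:
            hiccups.append((line, ts - prev))
        prev = ts
    return hiccups

sample = [
    "73450 ReadEngineOut: String:",
    "73481 MakeMoveForMe: Locked: 0",
    "73495 GetMove: Before CheckForEmptyLine",
    "73801 GetMove: GetTextQuick: 1",   # 306 ms gap: a hiccup
]
print(find_hiccups(sample))  # reports the 306 ms jump
```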
Hick ups that deteriorate script performance
Re: Hick ups that deteriorate script performance
All this is going to be dependent on the operating system, and where you are doing file reads/writes the time that takes is of course up to the OS. You could perhaps try increasing the process priority in Task Manager?
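For anyone wanting to script that rather than click through Task Manager each run: a best-effort Python sketch (psutil is a third-party assumption on Windows; raising priority usually needs elevated rights, so this may quietly leave things unchanged):

```python
import os
import sys

def raise_priority():
    """Best-effort attempt to raise this process's scheduling priority.

    Returns a short status string; leaves things unchanged if the OS
    refuses (raising priority usually requires elevated rights).
    """
    if sys.platform == "win32":
        try:
            import psutil  # third-party; assumed installed
            psutil.Process().nice(psutil.HIGH_PRIORITY_CLASS)
            return "high"
        except Exception:
            return "unchanged"
    try:
        os.nice(-5)  # negative niceness = higher priority on POSIX
        return "niced"
    except OSError:
        return "unchanged"

print(raise_priority())
```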
Marcus Tettmar
http://mjtnet.com/blog/ | http://twitter.com/marcustettmar
Did you know we are now offering affordable monthly subscriptions for Macro Scheduler Standard?
Re: Hick ups that deteriorate script performance
Hi,
Thanks, I will try that.
If writing the log file, or reading/writing other files, sometimes takes much longer than normal, could it be useful to place those files on a RAM drive instead of a normal disk?
Kind regards,
Remco
Re: Hick ups that deteriorate script performance
Logically that makes sense. But I guess you'll have to try it to know for sure.
Marcus Tettmar
Pro Scripter (AU)
Re: Hick ups that deteriorate script performance
Hi,
I had a similar performance issue that I thought was related to the scripts. Then I realized the .exe scripts were already at their optimum.
Then I thought maybe the OS and its privileges had something to do with it.
That turned out to be the case, and also share-drive load.
So what the solution ended up being was this:
> Run all .exe scripts locally from each machine (preferably from a folder on the C:\ drive not monitored by antivirus)
> Read and write reports locally throughout the .exe script's run
> At the end of the run, copy the report file to the share drive
This dramatically improved the performance of my .exe scripts, by minutes.
Issues were:
> Network load when multiple users were running the .exe scripts from the same location (share drive)
> Reading and writing files to a share under high load increased latency and hence the time to complete the task
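The "write locally, copy at the end" step above is straightforward to sketch (Python for illustration; the file and directory names are placeholders):

```python
import os
import shutil

LOCAL_REPORT = "report.txt"   # fast local disk
SHARE_DIR = "share"           # stands in for the share drive

# During the run: all writes go to the local file only.
with open(LOCAL_REPORT, "w") as f:
    f.write("run results...\n")

# End of run: one copy to the share, so network latency is paid once.
os.makedirs(SHARE_DIR, exist_ok=True)
shutil.copy2(LOCAL_REPORT, os.path.join(SHARE_DIR, LOCAL_REPORT))
```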
Loving MS's Capabilities!!!