Data Analysis Software - Motec, Gems, AEM
Moderators: JeffC, rdoherty, stieg, brentp
-
- Posts: 8
- Joined: Wed Feb 27, 2013 2:00 pm
- Location: Trois-Rivieres, Qc, Canada
I am used to Motec's i2 Pro software (http://www.motec.com/i2/i2overview/).
I found AEM's Data Analysis software (http://www.aemelectronics.com/data-logg ... ftware-81/). They seem to have an agreement with GEMS and their data analysis software (http://www.gems.co.uk/?content=pages&id ... a-analysis).
The backend functions (e.g. math channels, sub-functions, interface flexibility, the interpreter, ...) of all three of these packages appear identical. I don't know whether they actually share the same backend (open source, MATLAB, or a partnership) with different interfaces layered on top.
Since I like Motec i2 Pro, I think the RaceCapture Pro software would benefit from targeting i2 Pro.
Yes, AEM, GEMS, and Motec hardware are prohibitively expensive, but their data analysis software is free. Their hardware only logs data, without any outputs; that gives RaceCapture Pro a serious advantage.
Could the RCP software use the same backend as i2 Pro? That's a good question!
We planned all along for the open nature of the CSV file format in RaceCapture/Pro to ease importing into other systems - from spreadsheets to 3rd party applications.
Along with that, RaceAnalyzer software represents the basis of a fully open source software package that can be grown into a more full-featured system.
Please keep us updated on your investigation into other software packages!
thank you,
I managed to open CSV data from RaceCapture Pro with GDA (not the Pro version).
Here is how:
* Open your CSV data with Excel (or another spreadsheet)
* Move the time column to the first position
* Make the time start at 0 (subtract the first time from each cell)
* Change each column title so it contains only the channel name in quotes (e.g. "Speed")
* Import your modified CSV into GEMS DLog99 (freely downloadable on their website)
* Save the project (this will create a .stf file)
* Open this file with the latest version of GDA
It is really nice software!
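The spreadsheet steps above (moving the time column to the front and trimming the column titles) can also be scripted. Here is a minimal Python sketch, assuming headers in the RaceCapture CSV look like "Speed"|"kmh"|10; the function name is mine, not from any of the tools:

```python
# Sketch only: reorder columns so time comes first, and keep just the
# channel name from each "Name"|"unit"|rate header. Zero-basing the time
# values is a separate step, as described above.
def reorder_and_rename(rows, time_name="Time"):
    names = [h.split("|")[0].strip('"') for h in rows[0]]
    t = names.index(time_name)                      # locate the time column
    order = [t] + [i for i in range(len(names)) if i != t]
    header = [names[i] for i in order]
    return [header] + [[r[i] for i in order] for r in rows[1:]]
```

The output can then be written back to CSV and imported into DLog99 as in the steps above.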
-
- Posts: 101
- Joined: Tue Jan 15, 2013 1:37 pm
This is great! I agree the software is much more developed; hopefully RA will eventually eclipse it!
How did you get it to deal with lap times? I can replace channels and get it to show the track map, but I can't for the life of me get it to split out laps (which is what would make it really powerful, the sector and corner analysis would be awesome).
Any tips and tricks you find (I'll post the same) would be awesome.
To get lap times, you need to:
1. Set a distance channel in "Data > Distance channel setup" (derived from Speed or GPS coordinates)
2. Right-click where you want to place your beacon and select "Place GPS Lap Beacon at cursor"
3. Increase the "Beacon range" until you see a list of valid lap times.
You then have the lap list on the left!
Small edit to my first post:
In RaceCapture Pro, the time is logged in the format hhmmss.000 (where 000 is milliseconds). In order to import it correctly into DLog99, you need seconds starting from zero.
With a bit of Excel, you can make the transformation (I will post my Excel sheet later).
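For reference, the hhmmss.000-to-zero-based-seconds transformation can be sketched in Python (function names are mine, just an illustration of the arithmetic):

```python
def hhmmss_to_seconds(t):
    # "121300.398" -> 12 h, 13 min, 0.398 s -> seconds since midnight
    t = float(t)
    h = int(t // 10000)
    m = int(t // 100) % 100
    s = t % 100
    return h * 3600 + m * 60 + s

def zero_base(times):
    # Shift the whole column so the first sample is 0.000 s, as DLog99 expects
    secs = [hhmmss_to_seconds(t) for t in times]
    return [s - secs[0] for s in secs]
```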
You can then add as many widgets as you want, and there are also math channels, so you can compare the time delta between two laps and see where you gain/lose time.
-
- Posts: 30
- Joined: Tue Aug 13, 2013 12:15 pm
- Location: Amersfoort, the Netherlands
Hello and sorry for the delay.
In order to import the RaceCapture file into DLog99, you need:
* A column with absolute time in seconds (starting from 0). You can derive it from the GPS time (hhmmss.000) in Excel.
* Only float values (no text characters). You may need to delete some columns.
* The time column needs to be first.
I also noticed that there can be some errors in the logging (sporadic garbage characters). You need to delete those lines before importing (otherwise, the import will fail).
I will post my Excel sheet as soon as I have my other computer.
Hope it helps
-
- Posts: 2
- Joined: Thu Oct 17, 2013 11:04 am
- Location: Italia
Federico, I too had some floating point errors. I think neoraptor left out quite a few details, unless of course I'm mistaken about how strict Dlog99 is about its log format. After lots of playing with a Perl script and Dlog99 (countless retries, ugh), I think I found this additional critical tidbit:
- Dlog99 requires every field to have a floating-point value (assuming you use the float import option, otherwise integer). If any field is blank, Dlog99 throws the error:
" is not a valid floating point value.
The quote in that string isn't a typo. Weird, and not a very helpful error message.
So I wrote a Perl script that:
- modifies the headers so they only have the first value, like "Speed" or "AccelX"
- moves the time column to the first column (it just swaps with whatever column was originally first, as I don't think the order of the other columns matters)
- makes the start of the log time 0; subsequent values are adjusted relative to that time and converted to seconds (the current time format is HHMMSS.xxx, where xxx is milliseconds; I just convert the HHMMSS to epoch.xxx and then subtract the first time from all subsequent times)
- fills in the blanks. Since lines may have some or no data points, I initialize any empty data points in the first data line to zero. I then step through each line and each data point in that line: if it's blank, I update it to the last known value for that column; if not, I update the last known value to the current data point. This fills in the blanks with last-known values, making Dlog99 happy.
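That fill-in-the-blanks step can be sketched in Python (the actual script is in Perl; this is just an illustration of the same last-known-value approach, with names of my own choosing):

```python
def fill_blanks(rows):
    # Blank fields in the first data row become "0"; after that, every
    # blank is replaced with the last known value for that column.
    last = None
    out = []
    for row in rows:
        if last is None:
            row = [v if v != "" else "0" for v in row]
        else:
            row = [v if v != "" else last[i] for i, v in enumerate(row)]
        out.append(row)
        last = row
    return out
```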
That's pretty much it.
I'll gladly share it if people want it. It needs a couple of Perl modules that are pretty easy to find (trivial with Cygwin's installer).
I used the output file from the script by importing it into Dlog99, then saving it in Dlog99 as an .stf file, which I can then open in GEMS Data Analysis.
I just noticed something odd, which might mean I don't have this conversion quite right. The data rate for Speed in my RC log is 10 Hz. However, after conversion (where it's still 10 Hz) and import into Dlog99, the displayed data rate drops to exactly 1 Hz. No idea why right now. I imagine other data has this issue too. Sigh...
Federico Caporale wrote: I tried following your indications but Dlog99 just doesn't recognize the values... I even tried exporting the demo file (Dlog99 creates a .txt file) but it gives me the same error:
' 0.000' is not a valid floating point value
Do you have some idea?

When you import your file into Excel, do you have only numbers (not text)? If a number is preceded by a space, it is considered text, not a number, and the import into DLog99 fails.
@jpf11: could you send me your Perl script? I would like to try it with my data set to see if it works too.
Do you delete the invalid lines?
Example of bad raw data (I often have some lines like this one):
"TPS"|"%"|10,"RPM"|"RPM"|1,"AccelX"|"G"|30,"AccelY"|"G"|30,"AccelZ"|"G"|30,"Roll"|"Deg/Sec"|30,"Latitude"|"Deg"|10,"Longitude"|"Deg"|10,"Speed"|"kmh"|10,"Time"|"sec"|10,"GpsSats"|""|10
99.30,,-0.626,1.596,2.499,32.836,48.515594,6.648700,64.08,121300.398,9,,1.739,-1.211,-2.471,-3.412,,,,,,,-2.208,-2.376,1.729,2.559,,,,,
100.00,156657,-1.932,-0.430,-1.879,17.697,48ÿÿñ5575ì6ÿ6487ò8¬65.48,±2ÿ300ÿ500,9,,2.485,1.630,-2®438,-8.955,,,,,,,-1.397,1.904,2.499,47.335,,,,,
99.65,,-2.459,-2.477,2.499,13.006,48.515553,6.648756,67.24,121300.602,9,,-1.856,-2.418,1.679,-3.625,,,,,,,-1.719,-2.071,2.216,-59.062,,,,,
Neo - I just sent you my current Perl script (which I think now has the time fixed) as well as the RC log I was testing with.
I think my script will actually cover those weird non-number data quality issues by overwriting non-number values with the previous number value for that column.
As noted in my PM, I found that data rates appear to get greatly reduced in Dlog99 for many if not all channels, including "Speed" as mentioned in my last post, but also others like "AccelX", "AccelY"... I didn't do an exhaustive check, but it's a problem. I'm not sure what the issue is at the moment. Are you seeing this?
So I might have fixed all my issues, but I'm not entirely sure why.
I made a change to my RC Pro config. I had some harebrained idea that maybe it was the time resolution that was causing Dlog99 to behave funny with my import, so I upped Time to 100 Hz in my config. I also made lots of script changes (which I'll go into in a sec), but using my new script on my old 1 Hz log and on a new log with 100 Hz time, the 1 Hz log is still very screwy in Dlog99, with only 1 Hz resolution and screwy values that are way off from what's in the logs.
In the script here's what I changed:
- Found an issue with time. Time, as I already stated, has to be in HHMMSS.xxx (where xxx is milliseconds). Dlog99 of course needs the time in seconds, relative to the start time. My script did not handle rolling over at midnight, where the time goes from 235959.xxx to 0.xxx. Basically this made my script barf. I got lucky and happened to have a log do this, and I caught it. So I modified my script to handle this situation and add 24 hours of seconds to the converted time. I cheated a bit and only cover this scenario up to 10am the next day, assuming no one is doing more than 10 hours of logging (though if necessary a never-ending conversion could be written; it's just more of a pain in my butt to do that now).
- the above fix had me clean up a bunch of other things, just some housekeeping.
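The midnight-rollover fix can be sketched like this (assuming the times have already been converted to seconds of day; this is my own illustration, not the script's actual code):

```python
DAY = 24 * 3600  # seconds in a day

def elapsed_since_start(seconds_of_day, start_seconds):
    # If the GPS clock wrapped past midnight (235959.xxx -> 000000.xxx),
    # the raw difference goes negative; add a day to correct it.
    dt = seconds_of_day - start_seconds
    if dt < 0:
        dt += DAY
    return dt
```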
If anyone wants to try it out, I've attached it, as I think it's good enough for others to see. Please don't judge the coding though... I'm not a developer; I completely hacked this together. It uses Text::CSV, Scalar::Util, and Time::Piece, which were easy to get through Cygwin's installer. If people need help setting up Cygwin, I can post a quick explanation of how to use it. Linux users most likely already know how to get these modules. Just run it, import the output file in Dlog99 (use the Float option in the import window), save the .stf file when Dlog99 prompts, and then open the .stf in GEMS DA.
Now I did find one issue, possibly a bug, in my RC Pro's log. I found an instance of time going backwards for one tick. I forget which log I saw it in, but basically I had something like 234054.1 for a number of ticks, then one line with a time of 234054 (no milliseconds), and the next line returning to 234054.1. Odd. I reported it to Brent. I don't think this behavior is very common, but I also don't know what impact it might have, if any. I don't think it will affect my script, but I thought I would share this tidbit with everyone.
- Attachments
-
- rccvs2gems.pl.txt
- v1.1 of my RC Pro CSV-to-GEMS (well, Dlog99) converter. Parameters are "inputfilename outputfilename".
- (6.14 KiB) Downloaded 3593 times