OMRON Forums

Execute move every n triggers


andyf:

I have a timing card that generates a pulse every millisecond, and I need to trigger a motor move every n milliseconds. I was planning to purchase a timer/counter card to count the pulses and then generate a trigger pulse to the Delta Tau digital I/O card. However, I am curious whether the PPMAC can do the counting accurately itself, perhaps in a PLC, or whether there is another feature of the PPMAC that might help in this scenario.

Curt:

If you have a spare hardware servo channel, you can use the channel's pulse frequency modulation (PFM) feature to generate pulses at a very accurately specified interval. This output can be fed back into the channel's counter inside the ASIC (Gate1.Chan[j].EncCtrl = 8), and a software task can read the counter value to decide whether to trigger motion. In this case the hardware generation probably has far more accuracy than the software can use.
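
A minimal sketch of that setup, assuming the IC is Gate1[4] and the spare channel is 0 (both placeholders; verify the element names, values, and the PFM clock setup in Gate1[4].PfmClockDiv against the Software Reference for your IC):

Gate1[4].Chan[0].OutputMode = 3   // outputs A and B as DAC, output C as PFM pulses
Gate1[4].Chan[0].EncCtrl = 8      // count the channel's own PFM output internally
Gate1[4].Chan[0].Pfm = 1000       // example command value; the PFM section of the
                                  // manual gives the resulting pulse frequency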

 

You could also just monitor the status element Sys.ServoCount, which increments each servo cycle, and decide when enough servo cycles have elapsed for you to trigger your next move. Since the software that will command the move operates on the servo cycle, this is probably enough accuracy.
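
For instance, a minimal PLC sketch of that approach (the PLC number, global name, and handshake flag are hypothetical, and the cycle count assumes a nominal 0.4427 ms servo period):

global LastTrigger;                          // servo cycle count at the last trigger
open plc 10
if (Sys.ServoCount - LastTrigger >= 2259) {  // ~1000 ms / Sys.ServoPeriod cycles
    LastTrigger = Sys.ServoCount;
    Coord[1].Q[1] = 1;                       // flag that a waiting motion program polls
}
close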

andyf:

Thanks Curt. This post is related to my other post on clock accuracy. We have other systems in the observatory (e.g. cameras) that are synced to the same time reference system. I am looking at two options:

 

1) Have the time system hardware generate a single pulse at the precise start of an observation, and let the Delta Tau sequence moves relative to that start. The problem here is that if the observation runs for a long time (e.g. 8 hours), I am concerned that the clock accuracy of 50 ppm (either CPU or axis) will drift and things will get out of sync; 50 ppm over 8 hours is up to about 1.4 s of drift, and we can only tolerate a few ms over 8 hours.

 

2) Have the time system hardware generate a pulse every millisecond from the start of an observation. The Delta Tau would count these pulses and execute moves at the appropriate times relative to the start. I've written an RTI PLC that accurately counts the pulses (tested for 16 hours straight), but I'm not sure if there is a slicker way to do this.
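
For reference, a PLC of that shape might look like the following (plc 0 runs every real-time interrupt; the gate/channel indices, globals, and handshake flag are placeholders, and the mask assumes a 24-bit up-counting register):

global PulseCount, LastCapt;
open plc 0
local NewCapt;
NewCapt = Gate1[4].Chan[0].ServoCapt;          // pulse count latched at the servo clock
PulseCount += (NewCapt - LastCapt) & $FFFFFF;  // 24-bit counter; mask handles rollover
LastCapt = NewCapt;
if (PulseCount >= 1000) {                      // n = 1000 pulses, i.e. one second
    PulseCount -= 1000;
    Coord[1].Q[1] = 1;                         // "go" flag for a waiting motion program
}
close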

Curt:

We do have an automatic feature that does what you need; many of our observatory users implement it. It's called "external time base", and it is explained in the User's Manual chapter "Synchronizing Power PMAC to External Events". In this mode, Power PMAC gets its sense of time for trajectory updates by counting the incoming external pulses, not from its own clock crystal. A quick summary:

 

1. Feed the pulse signal into the A input of an encoder input channel. For your application, set the decode for this channel to "pulse and direction" (= 0 or 4, whichever causes the counter to count up). It is best to have a signal frequency at least an order of magnitude higher than the servo frequency to reduce the quantization noise that can cause jitter. Since this counting is done in hardware, we can accept frequencies into the MHz range.

 

2. Process the count value in the Encoder Conversion Table with a "1/T" sub-count extension entry (for smoothness); this entry should be present by default. Set the entry's scale factor so that when the input frequency is at its ideal nominal value, the change in pulse count per servo cycle (found in EncTable[n].DeltaPos) equals the number of milliseconds in a servo cycle at its ideal nominal frequency (which should be the value in Sys.ServoPeriod).

 

3. Tell your coordinate system(s) to use this value for the time base by setting Coord[x].pDesTimeBase to EncTable[n].DeltaPos.a.

 

4. Write your motion program trajectory assuming that everything is operating at the ideal nominal frequencies. PMAC will automatically handle any deviations. If the true servo frequency is slightly higher than nominal, the value of DeltaPos will be slightly less than nominal, so the trajectory will increment a little less each servo cycle.

 

Once you have done this setup, everything happens automatically; you don't need to write any PLCs or make any ongoing adjustments.
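
Pulling the four steps together, a minimal setup sketch (every index here is a placeholder, the scale factor assumes a software 1/T entry with 9 fractional bits and a nominal 31.25 cts/msec input, and the element names and values should be verified against the Software Reference for your IC):

Gate1[4].Chan[0].EncCtrl = 0                  // pulse-and-direction decode; use 0 or 4,
                                              // whichever makes the counter count up
EncTable[1].Type = 3                          // software 1/T sub-count extension
EncTable[1].pEnc = Gate1[4].Chan[0].TimerA.a  // read this channel's counter/timer registers
EncTable[1].ScaleFactor = 1 / (512 * 31.25)   // 1 / (2^N * RTIF), with N = 9 fractional bits
Coord[1].pDesTimeBase = EncTable[1].DeltaPos.a
Coord[1].TimeBaseSlew = 1.0                   // let the time base follow fully; the
                                              // default of 0.00001 rate-limits it

On most systems an ECT entry like this already exists by default, in which case only the scale factor and the two Coord[x] elements need to be touched.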

andyf:

Thanks for the instructions, Curt. I was able to get this working on our system. However, I am not seeing the accuracy I had hoped for. For example, my test motion program looks like this:

 

m2023 = 0       // drop the output (ordinary, non-synchronous write)
X1000 tm1000    // 1000 ms move to X1000
m2023 = 1       // raise the output
delay 4000
m2023 = 0
X3000 tm1000    // 1000 ms move to X3000
m2023 = 1
delay 4000

The writes to m2023 toggle a digital output pin, which I am timestamping with another piece of hardware. I would expect the measured move times to be 1000 ms, but instead I am measuring +/-20 ms of variation around that.

 

Are there additional adjustments that can be made to improve this?

Curt:

What you are measuring here is the time between the computation of successive moves, not between the starts of their execution. +/-20 ms is a little higher than I would have expected, but the key with this measurement is the long-term drift. What is the accumulated time (and error) over a series of 1000 moves?

 

If you change your assignment commands to "m2023 == 0" and "m2023 == 1", the actual assignment to the output is delayed until the beginning of actual execution of the next commanded move. You should see much less jitter in this case -- down to the servo-cycle level, or to the segment level if Coord[x].SegMoveTime > 0. Still, the true measure is not between individual moves, but in the long term -- how much error accumulates.
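
For example, the first move of your test program rewritten with synchronous assignments:

m2023 == 0      // latched; written when the X1000 move actually begins executing
X1000 tm1000
m2023 == 1      // written at the start of the next command's execution (the delay)
delay 4000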

 

One test you will definitely want to do is to vary the physical servo update time: if you are using PMAC2-style ASICs to generate the clocks, you can vary the value of Gate1[i].PwmPeriod slightly. You should see no effect on the long-term timing.
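
For example (assuming the clock-source IC is Gate1[4]; hardware setup elements are write-protected, hence the WpKey line):

Sys.WpKey = $AAAAAAAA                             // unlock write-protected setup elements
Gate1[4].PwmPeriod = Gate1[4].PwmPeriod * 1.001   // stretch the servo period by ~0.1%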

andyf:

OK, I see what you mean about using "==" versus "=". It turns out this was not the root of my problem, though.

 

I also realized that the closer my EncTable[n].DeltaPos is to Sys.ServoPeriod, the less drift I see. In fact, I was able to reduce the drift to about a millisecond every 12 moves. Is there a formula for setting the ECT scale factor to achieve this? I did it through trial and error until I was as close as possible and had run out of precision in the scale factor.

 

What is the next step to reduce this drift?

 

Input signal period: 32 microseconds, 50% duty cycle
Sys.ServoPeriod = 0.44274210000000022
EncTable[n].ScaleFactor = 0.000123532954799107
EncTable[n].DeltaPos = 0.442742109999999522

 

 

Curt:

Following the examples in the User's Manual, you should set your encoder table scale factor to:

 

1 / [(2^N) * RTIF]

 

where N is the number of fractional bits of timer-based interpolation the table produces, and RTIF (real-time input frequency) is the nominal input count frequency in cts/msec.

 

With a period of 32 usec, your RTIF is 31.25 cts/msec (i.e. 31.25 kHz). If you are using a PMAC2-style ASIC and the Type 3 software 1/T count extension, you have 9 bits of fractional count estimation, so

 

ScaleFactor = 1 / [512 * 31.25] = 0.0000625

 

It looks like your number should be twice this (0.000125). I'm not sure what hardware you are using, or whether you truly have a count frequency of 31.25 cts/msec.

 

Remember that you probably don't want to match Sys.ServoPeriod exactly. The whole reason you are doing this is that Sys.ServoPeriod, which is set to the nominal servo clock period, is not correct to the accuracy you require.

andyf:

OK, I see where that formula is in the docs now. So this is just a starting point, right? If I use this formula...

 

RTIF = 32 cts/msec (I corrected this from my previous reply)
PMAC3-style ASIC, thus 8 bits of fractional count estimation (per the ACC-24E3 hardware manual, pg 23)

ScaleFactor = 1 / (2^8 * 32) = 0.0001220703125

 

Using this value directly, my tests show an accumulated difference of about +36 ms. So if I understand you correctly, from this point on I have to run tests and tweak this ScaleFactor until I get the accuracy I need. Correct?

 

BTW, pg 4 of the "Synchronizing Power PMAC to External Events" document states that 10 bits of fractional count are used in a PMAC3-style IC. I believe this is wrong, because using that value gave me a DeltaPos that was off by a factor of 4. The ACC-24E3 hardware manual says 8 bits, which seems to work.

 

 

 

Curt:

"I have to run tests and tweak this ScaleFactor until I get the accuracy I need. Correct?"

 

No. You have a very accurate clock signal, with a much smaller frequency error than our internally generated clock signal. When you compare the performance of the two (by comparing DeltaPos to Sys.ServoPeriod), the error you see comes from our clock, not yours. If you were to adjust the ScaleFactor to make them match better, you would be defeating the high accuracy of your source.

 

When you say you have found an accumulated 36 msec difference over some time period, that is the error you have corrected with your external time signal. That is, if you had used our internal time base, there would have been 36 msec of drift over that period, with a position error of 36 msec times your velocity. By using your accurate external time base, you have prevented this error.

 

Thanks for catching the error in the manual. (There is a different register that holds 10 bits of fraction.)

andyf:

Thanks Curt. I've been using the motion program below to test accuracy. It generates an output pulse from the Delta Tau every second, which I then timestamp using the timing board hardware. If I run it WITHOUT the external time base, I see about 1 ms of drift every 4 minutes. However, if I run it WITH the external time base, I see about 12 ms of drift EVERY SECOND!

 

So I've done everything according to our discussion. I am going to take a look at the timing board hardware and see if something is off there. Let me know if you can think of anything else it might be on the Delta Tau side.

 

open prog 30
abs
linear
q0 = 0

while (q0 == 0) { dwell 0 }     // idle until q0 is set from outside

while (q0 == 1) {
    m2023 == 0                  // synchronous output writes, latched at
    delay 1000                  // actual execution time; one cycle per second
    m2023 == 1
}

close

 

 

andyf:

Curt, I finally figured it out. It was Coord[x].TimeBaseSlew being set to the default of 0.00001, which was rate-limiting the time-base value used in the servo cycle. Once I changed it to 1.0, as suggested on pg 12 of the manual, voila. Now when I run the tests, things look very accurate! Five minutes have gone by with practically no drift.
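
In other words, the whole fix was this one-line setting (coordinate system index as appropriate):

Coord[1].TimeBaseSlew = 1.0   // the default of 0.00001 rate-limits time-base changes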

 

Thanks for all your time helping me out via this post. Cheers!
