Maximum time of xdk timer period?
2/8/18 9:08 AM

Hi, Community

 

Recently, I have been testing timer periods.

 

I made a macro for converting milliseconds to ticks:

#define MILLISECONDS(x) ((portTickType)(x) / portTICK_RATE_MS)

 

and I set up the timer:

  xTimerHandle timerHandle = xTimerCreate(
                (const char * const) "My Timer",
                MILLISECONDS(1),
                pdTRUE,
                NULL,
                myfunction
                );

 

and I used xTaskGetTickCount() to verify that the timer ran at the period I set.

At 10 milliseconds and at 5 milliseconds, I confirmed that the intervals between the return values of xTaskGetTickCount() were constant.

However, at 1 millisecond, the interval between successive return values of xTaskGetTickCount() was sometimes 2 milliseconds.

What is the limit of the timer period on the stock XDK; in particular, what is the shortest period that remains stable?

RE: Maximum time of xdk timer period?
2/8/18 11:35 AM as a reply to Byun SangKyu.
Hi Byun, welcome to the XDK community.

In theory, the resolution of the software timers built on the FreeRTOS kernel can be no finer than the tick period configured via configTICK_RATE_HZ in FreeRTOSConfig.h. This is because the software timers are clocked by the kernel's xTaskIncrementTick() function, which is called from the SysTick timer interrupt. So in your case, if the tick interrupt fires once per millisecond (the typical FreeRTOS configuration), timer expirations are limited to a resolution of 1 ms.
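For reference, this is roughly what the relevant setting looks like in FreeRTOSConfig.h (the exact cast varies between FreeRTOS versions; 1000 Hz is the typical default on the XDK):

#define configTICK_RATE_HZ    ( ( portTickType ) 1000 )  /* 1000 ticks/s -> 1 ms software timer resolution */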

The jitter you are experiencing can be caused by several factors. Timer callbacks are executed from a task context: the tick interrupt wakes the timer service task, which walks through the pending timers, evaluates which of them have expired, and calls their callbacks. Since this timer task has a priority of its own, the lower that priority is, the more the real-time response of a timer depends on the other XDK tasks (not only your application tasks, but also XDK system tasks such as the command processor and the BLE and WLAN services). This causes delays for timers whose expiration period is near the theoretical limit.

My suggestions:

- Try increasing the timer task priority in the FreeRTOSConfig.h file (in the XDK IDE, press Ctrl+Shift+R and type the file name into the box).

- After starting the timer, read the FreeRTOS tick count immediately after xTimerStart() returns; in the callback, take a second reading and subtract the first from it to check whether this really is a jitter problem (see the sketch below).
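Here is a minimal sketch of that measurement, assuming a FreeRTOS version that provides pdMS_TO_TICKS() (your MILLISECONDS() macro works just as well); the names are illustrative, and note that printing from the callback is itself a blocking operation, so treat the numbers as indicative only:

#include <stdio.h>
#include "FreeRTOS.h"
#include "timers.h"
#include "task.h"

static TickType_t startTick; /* tick count captured right after xTimerStart() */

/* Timer callback: prints how many ticks passed since the previous reading. */
static void myTimerCallback(TimerHandle_t xTimer)
{
    (void) xTimer;
    TickType_t now = xTaskGetTickCount();
    printf("elapsed ticks: %lu\n", (unsigned long) (now - startTick));
    startTick = now;
}

void startJitterTest(void)
{
    TimerHandle_t timerHandle = xTimerCreate("Jitter Timer",
            pdMS_TO_TICKS(1), pdTRUE, NULL, myTimerCallback);

    if ((timerHandle != NULL) && (xTimerStart(timerHandle, 0) == pdPASS))
    {
        startTick = xTaskGetTickCount(); /* first reading, right after xTimerStart() */
    }
}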

Let me know if this was helpful to you.

Felipe
RE: Maximum time of xdk timer period?
2/8/18 5:14 PM as a reply to Felipe Neves.
Hello Byun,

what Felipe pointed out is largely correct. In summary, timers are all called and managed within the context of a single task (the so-called timer daemon task). You can read more on this topic here.

Because of this, there is overhead whenever a timer's function has to be called.

To optimize the periodicity and to eliminate jitter, I would actually recommend using a task with a for-loop instead of a timer. With a task, you can maintain precise control over the periodicity using the function vTaskDelayUntil(). You can find documentation and an example for this function here.

This should help in getting a stable 1 ms period out of the operating system. Keep in mind that, as Felipe pointed out, the duration between two ticks is currently configured as 1 ms, which also caps the maximum frequency at 1 kHz.
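For completeness, a task running such a loop could be registered like this (a minimal sketch; it assumes a task function like the vTaskFunction from the linked example, and the task name, stack size and priority are placeholders you would tune for your application):

/* Register the fixed-period task with the scheduler. */
xTaskHandle taskHandle = NULL;          /* legacy FreeRTOS type name, as used on the XDK */
xTaskCreate(vTaskFunction,              /* the function containing the for-loop */
        (const char * const) "Sampler", /* task name, placeholder */
        configMINIMAL_STACK_SIZE + 256, /* stack depth in words, to be tuned */
        NULL,                           /* pvParameters */
        tskIDLE_PRIORITY + 1,           /* priority, to be tuned */
        &taskHandle);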

Please tell me if this was helpful, and do not hesitate to ask further questions.

Kind regards,
Franjo
RE: Maximum time of xdk timer period?
2/13/18 9:13 AM as a reply to Franjo Stjepandic.

Thank you Felipe, 

 

Then how do I increase the timer task priority?

 

I found the FreeRTOSConfig.h file and changed configTICK_RATE_HZ from 1000 to 2000.

Do I need to change any other configuration?

 

thanks

RE: Maximum time of xdk timer period?
2/13/18 9:20 AM as a reply to Byun SangKyu.

Thank you Franjo,

 

I tried to use the following function:

 

void vTaskFunction( void * pvParameters )
{
    TickType_t xLastWakeTime;
    const TickType_t xFrequency = 10;

    // Initialise the xLastWakeTime variable with the current time.
    xLastWakeTime = xTaskGetTickCount();

    for( ;; )
    {
        // Wait for the next cycle.
        vTaskDelayUntil( &xLastWakeTime, xFrequency );

        // Perform action here.
    }
}

I was able to confirm that each iteration of the loop occurred at a 1 ms interval.

When I printed sensor data for 10 seconds, theoretically 10000 samples should have been printed.

However, when I put the code for sampling the acceleration, gyro, and noise sensor data into the for loop, I could only measure about half of the data (about 4500 samples).

I only transmit the data to the PC.

I implemented the following program:

static void vTaskFunction( void * pvParameters )
{
    (void) pvParameters;
    // Block for 1ms.
    TickType_t xLastWakeTime;
    const TickType_t xFrequency = 1;
    xLastWakeTime = xTaskGetTickCount();
    for( ;; )
    {
        vTaskDelayUntil( &xLastWakeTime, xFrequency );
        if( startSave )
        {
            Accelerometer_readXyzGValue(xdkAccelerometers_BMA280_Handle, &bma280);
            Gyroscope_readXyzDegreeValue(xdkGyroscope_BMG160_Handle, &bmg160);
            Acoustic = BSP_Mic_AKU340_Sense();
            count++;
            printf("%d,%.0f,%.0f,%.0f,%ld,%ld,%ld,%lu\n",
                    count,
                    (float) bma280.xAxisData, (float) bma280.yAxisData, (float) bma280.zAxisData,
                    (long int) bmg160.xAxisData / 1000, (long int) bmg160.yAxisData / 1000, (long int) bmg160.zAxisData / 1000,
                    (unsigned long) Acoustic);
        }
    }
}

Is this due to the delay that occurs during sampling?

In other words, the cycle does not hold at 1 ms. What should I do in this case?

 

RE: Maximum time of xdk timer period?
2/13/18 2:47 PM as a reply to Byun SangKyu.
Hello Byun,

First of all, yes, using the reading functions within the task's loop increases the overall delay between two iterations of the task. This is unavoidable, since reading the sensor values also requires some processing time.

The biggest issue here is the fact that you are using printf() in every iteration. It is an expensive blocking operation, requiring a more or less constant time of 1 ms per iteration in your code. I/O is the bottleneck in most systems, and the XDK is no exception.

What I would recommend is to remove the printf() from the iteration and buffer the data instead. Then, only push the data once the buffer is full. I have tested this with the code you provided in your post, and I can confirm that the entire execution over 10 seconds took exactly 10000 ticks.
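As a rough sketch of what I mean by buffering (all names here are hypothetical, and the format string is shortened for readability):

#include <stdio.h>
#include <string.h>

#define SAMPLE_BUFFER_SIZE  8192    /* tune to the RAM you can spare */

static char sampleBuffer[SAMPLE_BUFFER_SIZE];
static size_t bufferUsed = 0;

/* Format one sample into the buffer; flush everything first if it would not fit. */
static void bufferSample(int count, long gx, long gy, long gz)
{
    char line[64];
    int len = snprintf(line, sizeof(line), "%d,%ld,%ld,%ld\n", count, gx, gy, gz);

    if (len > 0)
    {
        if (bufferUsed + (size_t) len >= sizeof(sampleBuffer))
        {
            fwrite(sampleBuffer, 1, bufferUsed, stdout); /* one bulk write instead of many printf() calls */
            bufferUsed = 0;
        }
        memcpy(&sampleBuffer[bufferUsed], line, (size_t) len);
        bufferUsed += (size_t) len;
    }
}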

Printing every single line multiplies the number of ticks needed for 10 seconds of sampling by a factor of 2.3. If, on the other hand, you only print when the internal buffer of printf() reaches its limit, the number of ticks is only multiplied by a factor of 1.33.

Overall, it really depends on your use case. If you want to print every single set of data instantly, you will not be able to get above 500 Hz. This raises the question: what is your use case? Are you intending to send the data via WiFi, or store it on an SD card?

Regarding your other questions, the timer task priority can be set using

#define configTIMER_TASK_PRIORITY   1 // your priority here (1 to 4)

within the file SDK > xdk110 > Common > config > FreeRTOSConfig.h

Changing the tick rate, on the other hand, should not have a significant effect, especially if you only have one task in your entire application. At every tick, the kernel decides which task gets the CPU; if there is only one task, each tick just generates unnecessary overhead, and that overhead grows as the tick rate increases.

It is important to understand that the FreeRTOS tick rate is not the same as the XDK's internal clock frequency. The tick rate is purely a software tick rate, configured with a 1 ms period by default.

Please tell me if this was helpful, and do not hesitate to ask further questions.

Kind regards,
Franjo
RE: Maximum time of xdk timer period?
2/14/18 1:52 AM as a reply to Franjo Stjepandic.

Thank you Franjo, 

 

I sample the sensors on a fixed 1 ms cycle and try to stream the data in real time (for a virtual and augmented reality simulator).

The SD card is not used. The first step is to transfer the data via USB serial communication.
Secondly, it will use WiFi UDP (or TCP) communication.

Because of this, printf() is called continuously inside the for loop.

 

I do not quite understand how to use buffers.

Does the buffer refer to a software buffer?

e.g. char buffer[10000] = {0};

But as far as I know, the size of an array that can be declared on the device is limited by the available memory.

Also, is it true that the device cannot exceed 500 Hz (2 ms) when printing each single dataset immediately?

If I could make the timer cycle shorter than 1 ms (i.e. microseconds), would it be possible to overcome this?

Thank you.

RE: Maximum time of xdk timer period?
2/14/18 3:16 PM as a reply to Byun SangKyu.
Hello Byun,

I analyzed in more detail how to speed up the process of sending data through USB. For that, I used the function USB_transmitData() from the interface BCDS_USB.h, which is effectively the same as printf().

I tested this while sending only the accelerometer data via USB. Each iteration (out of 1000 in total) took 1.4 ticks. With your configuration, a factor of 1.8 ticks per iteration resulted, which equals 1.8 ms.

As far as I can tell, this is the maximum you can achieve when you are trying to send data from the accelerometer, gyroscope, the acoustic sensor and the counter at the same time.

Regarding whether lowering the timer cycle duration (to microsecond levels) would be useful: I would like to stress again that the bottleneck of the entire application is the USB transmission of the data. Increasing the tick rate of FreeRTOS would not solve the issue.

The only solution would be to find a way to send the data asynchronously (i.e. in a non-blocking way). The WiFi chip allows for asynchronous data transmission via UDP, but you would definitely not be able to send one set of data in every tick. You would have to accumulate some data first, then send it in one go.

Sending data via UDP will be much slower than using USB, so using a buffer would be a must. A buffer is basically an array with more space than you immediately require, in which you can accumulate data. And as you mentioned, allocating large arrays requires careful planning, but storing a dataset of 1000 integers should be easily possible. You may have to adjust the stack size of your task for that. A sketch of this accumulate-and-send pattern follows below.
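To illustrate the accumulate-and-send pattern, here is a sketch using standard BSD-style sockets; the XDK's SimpleLink WiFi API has its own socket calls, but the buffering idea is identical, and all names here are hypothetical:

#include <string.h>
#include <sys/socket.h>
#include <netinet/in.h>

#define PACKET_PAYLOAD  1024    /* keep a datagram comfortably below the path MTU */

static char payload[PACKET_PAYLOAD];
static size_t payloadUsed = 0;

/* Queue one sample; when the payload is full, send it as a single datagram. */
static void queueSample(int sock, const struct sockaddr_in *dest,
                        const void *sample, size_t len)
{
    if (payloadUsed + len > sizeof(payload))
    {
        sendto(sock, payload, payloadUsed, 0,
               (const struct sockaddr *) dest, sizeof(*dest));
        payloadUsed = 0;
    }
    memcpy(&payload[payloadUsed], sample, len);
    payloadUsed += len;
}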

In summary, achieving a 1 kHz frequency that involves both retrieving sensor data and sending it is a complicated use case, and a solution is not easily achieved.

Please let me know if that was helpful and feel free to ask if you have further questions.

Kind regards,
Franjo
RE: Maximum time of xdk timer period?
2/18/18 11:07 PM as a reply to Franjo Stjepandic.

Hello Franjo,

 

Thank you for comment,

 

Is the limitation in the firmware of the XDK sensors as implemented by BCDS, or in the hardware itself?

 

So, is it possible to modify the BCDS firmware?

Or would it be OK to purchase a BMA sensor and implement it independently?

I can use another microcontroller.

Thank you

RE: Maximum time of xdk timer period?
2/19/18 5:14 PM as a reply to Byun SangKyu.
Hello Byun,

it really depends on your use case.

In general, transmitting sensor data is always slower than just reading it. If you only have one sensor, 1 kHz can be possible. But if you add more sensors, you need more time to read the data, and you also need more time to transmit it. You will face this issue with any product or product combination.

Do note that this has barely anything to do with the FreeRTOS operating system. Using only one task is as efficient as it gets; even turning off preemptive scheduling yields only a minimal speed-up.

As such, the limit really depends on how much you demand. Given that UDP is about as minimal as conventional internet transmission protocols get, you may achieve 1 kHz if you only send one sensor's data, especially since the SimpleLink chip can send in a non-blocking fashion. Achieving 1 kHz with more than one sensor would require very thorough optimization.

You could, of course, alternatively purchase another microcontroller and add a BMA280, a BMG160 and an AKU340 to it. But please note that you would have to spend a lot of time researching an optimal solution and connecting the components with each other on your own.

The XDK, on the other hand, is an already integrated sensor node, where you do not need to worry about attaching the needed sensors, their respective libraries, or the electrical behavior. There are certainly some limitations to such a design, but their impact depends on the use case.

And I think the impact on this use case is quite minimal. As I have mentioned before, data transmission, especially via radio, is usually the main bottleneck in an application.

I hope this makes things clearer. Perhaps you could try implementing UDP in your application and verify the performance.

Please let me know if that was helpful and feel free to ask if you have further questions.

Kind regards,
Franjo
RE: Maximum time of xdk timer period?
2/19/18 11:31 PM as a reply to Franjo Stjepandic.

Thank you for your help, Franjo!


I learned a lot thanks to you.


Have a nice day
