Timers periodic interrupt - no documentation available, help #1095
Comments
I recognize your confusion; I had exactly the same. What you need to realize is that when you are not using setup() and loop(), you are skipping the core initialization, which sets the CPU clock prescaler to get your desired clock speed. So you have to do the CPU clock prescaling yourself. Try adding the F_CPU_init() function below to the bottom of your sketch and call it first thing in your main(void).
[edit] Oh, and for the millis(), micros() and delay() functions to work, you also need the core initialization. And in the Tools menu of the IDE you can select which timer you want millis() to use.
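A minimal sketch of what such an F_CPU_init() could look like (not necessarily the exact code referred to above; it just disables the default /6 main clock prescaler so the CPU runs straight off the 16/20 MHz oscillator):

```c
#include <avr/io.h>

void F_CPU_init(void)
{
    // MCLKCTRLB is under configuration change protection, so a plain
    // assignment is ignored; _PROTECTED_WRITE does the CCP unlock for us.
    _PROTECTED_WRITE(CLKCTRL.MCLKCTRLB, 0);   // PEN = 0: main clock prescaler off
}
```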
Thank you for your suggestion. I wasn't particularly worried about the setup you mentioned because I am compiling through PlatformIO, which already has a way of setting the clock frequency. I will try your method anyway and report back. Still, I feel quite confused about what the instructions I used really do. I would like to find an actual formula to estimate precisely the amount of time between each ISR call. Timing is critical in my project, and even if it “seems” to work, I want to be sure I'm estimating time correctly.
Looking again at your code, it's probably not going to make a difference, as you don't use the 16/20 MHz oscillator.
Here you select the 32.768 kHz oscillator as the main system clock, and you prescale it by a factor of 64. That would mean a 512 Hz CPU clock, which is very different from the MHz speeds more commonly used. [edit] So in fact what you are doing is prescaling the 20 or 16 MHz oscillator by 64 (the 32.768 kHz selection apparently doesn't take effect). Your Clock_init would also override the F_CPU_init() that I provided in my previous post, so that is not working anymore.
So I think your 400 ms comes from: 20 MHz (oscillator) / 64 (CLKCTRL_PDIV_64X_gc) / 2 (TCB_CLKSEL_CLKDIV2_gc) / 65535 (TCB_TIMEOUT_VALUE) ≈ 2.4 Hz, i.e. a period of roughly 420 ms.
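Turning that around for a target period, say the 10 ms you mentioned: with 20 MHz / 64 / 2 the timer is clocked at 156250 Hz, so you need about 0.010 × 156250 ≈ 1562 counts. A rough sketch of the whole thing (untested; the LED pin PA3 and the names F_TCB_HZ / TCB_TICKS_10MS are just for illustration):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

#define F_TCB_HZ        (20000000UL / 64 / 2)        // 156250 Hz with these prescalers
#define TCB_TICKS_10MS  (F_TCB_HZ / 100 - 1)         // ~10 ms worth of timer ticks

ISR(TCB0_INT_vect)
{
    PORTA.OUTTGL  = PIN3_bm;       // toggle the LED (PA3 here, adjust to your board)
    TCB0.INTFLAGS = TCB_CAPT_bm;   // clear the flag, otherwise the ISR keeps firing
}

int main(void)
{
    // Main clock: 16/20 MHz oscillator divided by 64 (change-protected register)
    _PROTECTED_WRITE(CLKCTRL.MCLKCTRLB, CLKCTRL_PDIV_64X_gc | CLKCTRL_PEN_bm);

    PORTA.DIRSET = PIN3_bm;                               // LED pin as output

    TCB0.CCMP    = TCB_TICKS_10MS;                        // TOP value -> ~10 ms period
    TCB0.CTRLB   = TCB_CNTMODE_INT_gc;                    // periodic interrupt mode
    TCB0.INTCTRL = TCB_CAPT_bm;                           // enable the CAPT interrupt
    TCB0.CTRLA   = TCB_CLKSEL_CLKDIV2_gc | TCB_ENABLE_bm; // CLK_PER / 2, start counting

    sei();
    for (;;) { }
}
```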
Does CLKCTRL_MCLKCTRLB change the system clock and CLKCTRL_MCLKCTRLA change only the timers' clock source? I ask because I would like to have control over both the overall system clock (for performance and consumption) and the timers' clock (to evaluate timing precisely). Right now I have reached a delay of 1 second by changing the code you have seen above.
Although I now know how to evaluate timing (kind of, thanks to you), I am no longer so sure about the main system clock. PS: I also see in the datasheet that the clock source for the TCBx timer can be chosen between two different sources (?)
The Attiny412 has 3 clock sources:
1a) the 16 MHz oscillator
1b) the 20 MHz oscillator
2) the 32.768 kHz (ultra low power) oscillator
The choice between 1a) and 1b) is fuse driven, so it cannot be selected in software; this is why it is usually called the 16/20 MHz oscillator. You can choose between 1 or 2 in software. Please study 10.2.1 Block Diagram - CLKCTRL carefully and look at the "Other Peripherals" box, as that's where TimerB also gets its clock from: the main clock prescaler.
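To make the split concrete: MCLKCTRLA selects which oscillator drives the main clock (CPU and peripherals alike), MCLKCTRLB only prescales that main clock, and TCB0 then picks CLK_PER or CLK_PER/2 with its own TCB_CLKSEL bits. Roughly (both registers are change-protected, hence _PROTECTED_WRITE; the function name is just for illustration):

```c
#include <avr/io.h>

static void clock_init_example(void)
{
    // Source: the internal 16/20 MHz oscillator (which of the two is set by fuse)
    _PROTECTED_WRITE(CLKCTRL.MCLKCTRLA, CLKCTRL_CLKSEL_OSC20M_gc);

    // Prescaler: divide the main clock by 64 -> CPU and CLK_PER both run at F_osc / 64
    _PROTECTED_WRITE(CLKCTRL.MCLKCTRLB, CLKCTRL_PDIV_64X_gc | CLKCTRL_PEN_bm);
}
```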
Hello there. I am trying to port some code from an Attiny45 over to an Attiny412.
The code on the Attiny45 made use of an "Output Compare Match Interrupt" with the TCNT0 registers, and how that works and how to use it is quite well documented.
Unfortunately I haven't found equally clear documentation for this kind of use on the new Attiny series, so I am asking for some help here.
In order to better understand these "new" registers I am trying to build a very basic example:
An ISR toggles an LED on/off periodically, and I want to be able to determine and manipulate its rate (period, timing).
Right now, after looking at various documentation and places, I landed on the following code:
With the above code I get an LED rate that seems to be around 400 ms, although I'm not getting why! I would like to understand how to manipulate these registers to hit a desired timing precisely (for example, 10 ms).
A few things I noticed:
• Decreasing the TCB_TIMEOUT_VALUE obviously decreases the time interval at which the ISR gets called;
• Assigning a smaller prescaler in CLKCTRL_MCLKCTRLB = CLKCTRL_PDIV_64X_gc | CLKCTRL_PEN_bm decreases the time interval;
• The following instruction seems to do nothing: CLKCTRL_MCLKCTRLA = CLKCTRL_CLKSEL_OSCULP32K_gc;
• Using the assignment suggested by the datasheet to get into Periodic Timeout mode makes the code not work at all anymore: TCB0_CTRLB = TCB_CNTMODE_TIMEOUT_gc; (see the note after this list)
• Flashing with a lower clock speed (20 MHz is the default one; I tried decreasing it to all sorts of values to slow down the LED) doesn't affect the LED timing at all!
• With this instruction I have three clock options, but I don't understand what they really do.
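A note on that Periodic Timeout point: in the TCB chapter, the free-running periodic interrupt described here is the "Periodic Interrupt" mode, TCB_CNTMODE_INT_gc, while TCB_CNTMODE_TIMEOUT_gc selects the "Time-Out Check" mode, which only counts between edges of an Event System channel routed to the timer, so without an event configured it never fires. A minimal way to write the mode selection (a sketch, not the original code):

```c
#include <avr/io.h>

// For a free-running periodic interrupt, use INT mode; TIMEOUT mode needs
// an Event System input and appears to do nothing without one.
static inline void tcb0_select_periodic_interrupt(void)
{
    TCB0.CTRLB = TCB_CNTMODE_INT_gc;   // "Periodic Interrupt" counting mode
}
```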
I would also like to understand what the criteria are for correctly preserving the millis(), micros() and delay() functions if I am changing the timers.
Thank you guys in advance!