Servo moves upon plug-in. Sometimes to 90 and stops. Sometimes to 180 and back to 0

Newbie here. When I plug the USB into my Arduino to power the project, my servo wants to move around right away. Sometimes it goes to 90 degrees and stops until it is triggered by the ultrasonic sensor; in that case it starts at 90, goes back to 0, then to 180 and back to 0...
The other case is that upon powering it up, the servo sometimes goes to 180 and then back to 0. Neither of these behaviors is acceptable, because I am making a lock for a box and I can't have it unlock as soon as I power up the Arduino. I need the servo to remain at 0 until I trigger the sensor.
(More mods have to be made to the code so that the lock will remain open until I trigger it again but that is another discussion).
Your help is much appreciated!
#include <Servo.h>

int trig = 8;
int echo = 9;
int dt = 10;
Servo servo;
//int distance,duration;

void setup() {
  // put your setup code here, to run once:
  pinMode(trig, OUTPUT);
  pinMode(echo, INPUT);
  Serial.begin(9600);
  servo.attach(11);
}

void loop() {
  // put your main code here, to run repeatedly:
  if (calc_dis() < 5)
  {
    for (int i = 0; i <= 540; i++)
    {
      servo.write(i);
      delay(1);
    }
    delay(100);
    for (int i = 540; i >= 0; i--)
    {
      servo.write(i);
      delay(1);
    }
    delay(100);
  }
}

//This code is written to calculate the DISTANCE using ULTRASONIC SENSOR
int calc_dis()
{
  int duration, distance;
  digitalWrite(trig, HIGH);
  delay(dt);
  digitalWrite(trig, LOW);
  duration = pulseIn(echo, HIGH);
  distance = (duration / 2) / 29.1;
  return distance;
}
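For what it's worth, a common way to keep the horn at the locked position on power-up (a hedged sketch, not from the original post; the pin number and the 0-degree "locked" angle are assumptions) is to call write() with the closed angle before attach(), since the Servo library otherwise starts the output at 90 degrees. The brief twitch at the very instant power is applied comes from the servo hardware itself and typically can't be removed in software.

#include <Servo.h>

Servo servo;
const int LOCKED_ANGLE = 0;   // assumed closed/locked position

void setup() {
  servo.write(LOCKED_ANGLE);  // set the target BEFORE attach(), otherwise attach() outputs 90 degrees
  servo.attach(11);           // same pin as the sketch above
  // ...rest of setup unchanged
}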

Related

STM32 - Reading I2S to record a .WAV file. Audio choppy, what is causing it?

I'm using an STM32 (STM32F446RE) to receive audio from two INMP441 MEMS microphones in a stereo setup via the I2S protocol and record it to a .WAV file on a micro SD card, using the HAL library.
I wrote the firmware that records the audio into a .WAV with FreeRTOS, but the audio files that I record sound like Darth Vader. Here is a screenshot of the audio in Audacity:
If you zoom in, you can see a constant noise being inserted in between the real audio data:
I don't know what is causing this.
I have tried increasing the MessageQueue size, but that doesn't seem to be the problem; the queue stays at 0 most of the time. I've tried different frame sizes and sampling rates, changing the number of channels, and using only one INMP441, all without any success.
I'll proceed to explain the firmware.
Here is a block diagram of the RTOS architecture I have implemented:
It consists of three tasks. The first one receives a command via UART (with interrupts) that signals when to start or stop recording. The second one is simply a state machine that walks through the steps to write a .WAV file.
Here is the code for the WriteWavFileTask:
switch(audio_state)
{
  case STATE_START_RECORDING:
    sprintf(filename, "%saud_%03d.wav", SDPath, count++);
    do
    {
      res = f_open(&file_ptr, filename, FA_CREATE_ALWAYS|FA_WRITE);
    }
    while(res != FR_OK);
    res = fwrite_wav_header(&file_ptr, I2S_SAMPLE_FREQUENCY, I2S_FRAME, 2);
    HAL_I2S_Receive_DMA(&hi2s2, aud_buf, READ_SIZE);
    audio_state = STATE_RECORDING;
    break;
  case STATE_RECORDING:
    osDelay(50);
    break;
  case STATE_STOP:
    HAL_I2S_DMAStop(&hi2s2);
    while(osMessageQueueGetCount(AudioQueueHandle)) osDelay(1000);
    filesize = f_size(&file_ptr);
    data_len = filesize - 44;
    total_len = filesize - 8;
    f_lseek(&file_ptr, 4);
    f_write(&file_ptr, (uint8_t*)&total_len, 4, bw);
    f_lseek(&file_ptr, 40);
    f_write(&file_ptr, (uint8_t*)&data_len, 4, bw);
    f_close(&file_ptr);
    audio_state = STATE_IDLE;
    break;
  case STATE_IDLE:
    osThreadSuspend(WAVHandle);
    audio_state = STATE_START_RECORDING;
    break;
  default:
    osDelay(50);
    break;
}
Here are the macros used in the code for readability:
#define I2S_DATA_WORD_LENGTH (24) // industry-standard 24-bit I2S
#define I2S_FRAME (32) // bits per sample
#define READ_SIZE (128) // samples to read from I2S
#define WRITE_SIZE (READ_SIZE*I2S_FRAME/16) // half words to write
#define WRITE_SIZE_BYTES (WRITE_SIZE*2) // bytes to write
#define I2S_SAMPLE_FREQUENCY (16000) // sample frequency
The last task is responsible for processing the buffer received via I2S. Here is the code:
void convert_endianness(uint32_t *array, uint16_t Size) {
  for (int i = 0; i < Size; i++) {
    array[i] = __REV(array[i]);
  }
}

void HAL_I2S_RxCpltCallback(I2S_HandleTypeDef *hi2s)
{
  convert_endianness((uint32_t *)aud_buf, READ_SIZE);
  osMessageQueuePut(AudioQueueHandle, aud_buf, 0L, 0);
  HAL_I2S_Receive_DMA(hi2s, aud_buf, READ_SIZE);
}

void pvrWriteAudioTask(void *argument)
{
  /* USER CODE BEGIN pvrWriteAudioTask */
  static UINT *bw;
  static uint16_t aud_ptr[WRITE_SIZE];
  /* Infinite loop */
  for(;;)
  {
    osMessageQueueGet(AudioQueueHandle, aud_ptr, 0L, osWaitForever);
    res = f_write(&file_ptr, aud_ptr, WRITE_SIZE_BYTES, bw);
  }
  /* USER CODE END pvrWriteAudioTask */
}
This task reads from the queue an array of 256 uint16_t elements containing the raw PCM audio data. f_write takes its Size parameter as the number of bytes to write to the SD card, so 512 bytes. The I2S receives 128 frames (for a 32-bit frame, 128 words).
The following is the configuration for the I2S and clocks:
Any help would be much appreciated!
Solution
As pmacfarlane pointed out, the problem was with the method used for buffering the audio data. The solution consisted of reducing the overhead in the ISR and using a circular DMA for double buffering. Here is the code:
#define I2S_DATA_WORD_LENGTH (24) // industry-standard 24-bit I2S
#define I2S_FRAME (32) // bits per sample
#define READ_SIZE (128) // samples to read from I2S
#define BUFFER_SIZE (READ_SIZE*I2S_FRAME/16) // number of uint16_t elements expected
#define WRITE_SIZE_BYTES (BUFFER_SIZE*2) // bytes to write
#define I2S_SAMPLE_FREQUENCY (16000) // sample frequency
uint16_t aud_buf[2*BUFFER_SIZE]; // Double buffering
static volatile int16_t *BufPtr;
void convert_endianness(uint32_t *array, uint16_t Size) {
  for (int i = 0; i < Size; i++) {
    array[i] = __REV(array[i]);
  }
}

void HAL_I2S_RxHalfCpltCallback(I2S_HandleTypeDef *hi2s)
{
  BufPtr = aud_buf;
  osSemaphoreRelease(RxAudioSemHandle);
}

void HAL_I2S_RxCpltCallback(I2S_HandleTypeDef *hi2s)
{
  BufPtr = &aud_buf[BUFFER_SIZE];
  osSemaphoreRelease(RxAudioSemHandle);
}

void pvrWriteAudioTask(void *argument)
{
  /* USER CODE BEGIN pvrWriteAudioTask */
  static UINT *bw;
  /* Infinite loop */
  for(;;)
  {
    osSemaphoreAcquire(RxAudioSemHandle, osWaitForever);
    convert_endianness((uint32_t *)BufPtr, READ_SIZE);
    res = f_write(&file_ptr, BufPtr, WRITE_SIZE_BYTES, bw);
  }
  /* USER CODE END pvrWriteAudioTask */
}
Problems
I think the problem is your method of buffering the audio data - mainly in this function:
void HAL_I2S_RxCpltCallback(I2S_HandleTypeDef *hi2s)
{
  convert_endianness((uint32_t *)aud_buf, READ_SIZE);
  osMessageQueuePut(AudioQueueHandle, aud_buf, 0L, 0);
  HAL_I2S_Receive_DMA(hi2s, aud_buf, READ_SIZE);
}
The main problem is that you are re-using the same buffer each time. You have queued a message to save aud_buf to the SD-card, but you've also instructed the I2S to start DMAing data into that same buffer, before it has been saved. You'll end up saving some kind of mish-mash of "old" data and "new" data.
@Flexz pointed out that the message queue takes a copy of the data, so there is no issue with the I2S overwriting the data that is being written to the SD card. However, taking the copy (in an ISR) adds overhead, and delays the start of the new I2S DMA.
Another problem is that you are doing the endian conversion in this function (that is called from an ISR). This will block any other (lower priority) interrupts from being serviced while this happens, which is a bad thing in an embedded system. You should do the endian conversion in the task that reads from the queue. ISRs should be very short and do the minimum possible work (often just setting a flag, giving a semaphore, or adding something to a queue).
Lastly, while you are doing the endian conversion, what is happening to the incoming audio samples? The previous DMA has completed, and you haven't started a new one, so they will just be dropped on the floor.
Possible solution
You probably want to allocate a suitably big buffer, and configure your DMA to work in circular buffer mode. This means that once started, the DMA will continue forever (until you stop it), so you'll never drop any samples. There won't be any gap between one DMA finishing and a new one starting, since you never need to start a new one.
The DMA provides a "half-complete" interrupt, to say when it has filled half the buffer. So start the DMA, and when you get the half-complete interrupt, queue up the first half of the buffer to be saved. When you get the fully-complete interrupt, queue up the second half of the buffer to be saved. Rinse and repeat.
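For reference, with the DMA stream configured as Circular (e.g. in CubeMX), the reception only has to be started once over the whole double buffer. Assuming the same Size convention as the original HAL_I2S_Receive_DMA() call (READ_SIZE samples per half of the buffer), a sketch would be:

// Start once, e.g. when entering STATE_START_RECORDING; the circular DMA then
// keeps filling aud_buf[2*BUFFER_SIZE] until HAL_I2S_DMAStop() is called.
HAL_I2S_Receive_DMA(&hi2s2, aud_buf, 2 * READ_SIZE);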
You might want to add some logic to detect if the interrupt happens before the previous save has completed, since the data will be overrun and possibly corrupted. Depending on the speed of the SD-card (and the sample rate), this may or may not be a problem.
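One hedged way to add that check (illustrative only; the flag and counter names are mine) is to set a flag in the callbacks and clear it in the writer task after f_write() returns, counting an overrun whenever a callback fires while the flag is still set:

static volatile uint8_t half_pending = 0;    // set in the ISR, cleared by the writer task
static volatile uint32_t overrun_count = 0;  // halves overwritten before they were saved

void HAL_I2S_RxHalfCpltCallback(I2S_HandleTypeDef *hi2s)
{
  if (half_pending) overrun_count++;         // previous half was not written out in time
  half_pending = 1;
  BufPtr = aud_buf;
  osSemaphoreRelease(RxAudioSemHandle);
}

// The full-complete callback gets the same check, and the writer task sets
// half_pending = 0 right after its f_write() call completes.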

Detect pulse in Objective-C

At the moment I'm creating an iOS app which visualizes the port status of an Arduino. For that, the iPad receives information from the Arduino via a serial cable.
The Arduino sends a packet with its current port status every 100 ms. This status is visualized on the iPad.
The ports are input ports. I've noticed that the device I'm reading is pulsing the ports, so the Arduino reads alternating high/low levels. That creates flickering in the visualization.
My question is how to detect whether the level is really up or the input is just flickering.
The port is high for x seconds, then low for y seconds, and then it repeats. If the port is low for z seconds I need to show the port as low in the visualization; otherwise it is high.
- (void) readBytesAvailable:(UInt32)numBytes {
    int bytesRead = [manager read:rxBuffer Length:numBytes];
    for (int i = 0; i < bytesRead; i++) {
        if (rxBuffer[i] == 48) {
            [self setButtonRed];
        } else if (rxBuffer[i] == 49) {
            [self setButtonWhite];
        }
    }
}
https://www.dropbox.com/s/bhy5lbm8lkdhnoy/3wire.png?dl=0
If I understood correctly, the scenario is this: you want to determine whether the output is alternating its state or stuck at ground. You didn't specify the period/high-low times nor the number of pins, so I'll assume you have four buttons connected to Arduino pins 1, 2, 3 and 5, and I'll use placeholder constants.
You'll have to set CHECK_PERIOD to an appropriate sampling period, so that you sample the input 4-5 times per state, and CHECK_ITERATIONS so that you can tolerate missing a few samples.
For instance, if the normal waveform is 100 ms high and 100 ms low, I'd set CHECK_PERIOD to 20 and CHECK_ITERATIONS to, let's say, 3 or 4.
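The two thresholds are not defined in the snippet below; with the example timing above they could look like this:

#define CHECK_PERIOD     20   // ms between samples (4-5 samples per 100 ms state)
#define CHECK_ITERATIONS  3   // consecutive LOW samples before reporting the pin as LOW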
long previousInputCheck;
#define NUM_INPUTS 4
const int inputPins[] = { 1, 2, 3, 5 };
unsigned char inputCounter[NUM_INPUTS];
unsigned char inputStates[NUM_INPUTS];
... THEN, INTO THE MAIN ...
if ((millis() - previousInputCheck) >= CHECK_PERIOD)
{
  previousInputCheck += CHECK_PERIOD;
  unsigned char i;
  for (i = 0; i < NUM_INPUTS; i++)
  {
    if (digitalRead(inputPins[i]) == LOW)
    {
      if (inputCounter[i] <= CHECK_ITERATIONS)
        inputCounter[i]++;
      if (inputCounter[i] == CHECK_ITERATIONS)
      {
        inputStates[i] = LOW;
      }
    }
    else
    { // HIGH
      inputCounter[i] = 0;
      inputStates[i] = HIGH;
    }
  }
}

Processing Open CV Face Tracking

Hi, I started with the code from SparkFun below to do some face tracking, and I get this error:
the type OpenCV is ambiguous
I tried other examples from the OpenCV for Processing library,
and they work without a problem (also the face tracking example).
The original code from SparkFun was written for a different OpenCV (version 1, I believe).
But I could not make it work, because there is no library import at the top of the code.
Since I have OpenCV for Processing installed, I imported that:
import gab.opencv.*;
and from then on I get this error.
I don't see why it does not work, and I don't understand why it was supposed to work in the first place (since the original code does not import OpenCV at all).
Any help would be great.
Thanks.
/**********************************************************************************************
* Pan/Tilt Face Tracking Sketch
* Written by Ryan Owens for SparkFun Electronics
* Uses the OpenCV real-time computer vision framework from Intel
* Based on the OpenCV Processing Examples from ubaa.net
* This example is released under the Beerware License.
* (Use the code however you'd like, but mention us and buy me a beer if we ever meet!)
*
* The Pan/Tilt Face Tracking Sketch interfaces with an Arduino Main board to control
* two servos, pan and tilt, which are connected to a webcam. The OpenCV library
* looks for a face in the image from the webcam. If a face is detected the sketch
* uses the coordinates of the face to manipulate the pan and tilt servos to move the webcam
* in order to keep the face in the center of the frame.
*
* Setup-
* A webcam must be connected to the computer.
* An Arduino must be connected to the computer. Note the port which the Arduino is connected on.
* The Arduino must be loaded with the SerialServoControl Sketch.
* Two servos mounted on a pan/tilt bracket must be connected to the Arduino pins 2 and 3.
* The Arduino must be powered by a 9V external power supply.
*
* Read this tutorial for more information:
**********************************************************************************************/
import gab.opencv.*;
import hypermedia.video.*; //Include the video library to capture images from the webcam
import java.awt.Rectangle; //A rectangle class which keeps track of the face coordinates.
import processing.serial.*; //The serial library is needed to communicate with the Arduino.
OpenCV opencv; //Create an instance of the OpenCV library.
//Screen Size Parameters
int width = 320;
int height = 240;
// contrast/brightness values
int contrast_value = 0;
int brightness_value = 0;
Serial port; // The serial port
//Variables for keeping track of the current servo positions.
char servoTiltPosition = 90;
char servoPanPosition = 90;
//The pan/tilt servo ids for the Arduino serial command interface.
char tiltChannel = 0;
char panChannel = 1;
//These variables hold the x and y location for the middle of the detected face.
int midFaceY=0;
int midFaceX=0;
//The variables correspond to the middle of the screen, and will be compared to the midFace values
int midScreenY = (height/2);
int midScreenX = (width/2);
int midScreenWindow = 10; //This is the acceptable 'error' for the center of the screen.
//The degree of change that will be applied to the servo each time we update the position.
int stepSize=1;
void setup() {
  //Create a window for the sketch.
  size( width, height );

  opencv = new OpenCV( this );
  opencv.capture( width, height ); // open video stream
  opencv.cascade( OpenCV.CASCADE_FRONTALFACE_ALT ); // load detection description, here-> front face detection : "haarcascade_frontalface_alt.xml"

  println(Serial.list()); // List COM-ports (Use this to figure out which port the Arduino is connected to)
  //select first com-port from the list (change the number in the [] if your sketch fails to connect to the Arduino)
  port = new Serial(this, Serial.list()[0], 57600); //Baud rate is set to 57600 to match the Arduino baud rate.

  // print usage
  println( "Drag mouse on X-axis inside this sketch window to change contrast" );
  println( "Drag mouse on Y-axis inside this sketch window to change brightness" );

  //Send the initial pan/tilt angles to the Arduino to set the device up to look straight forward.
  port.write(tiltChannel);       //Send the Tilt Servo ID
  port.write(servoTiltPosition); //Send the Tilt Position (currently 90 degrees)
  port.write(panChannel);        //Send the Pan Servo ID
  port.write(servoPanPosition);  //Send the Pan Position (currently 90 degrees)
}

public void stop() {
  opencv.stop();
  super.stop();
}

void draw() {
  // grab a new frame
  // and convert to gray
  opencv.read();
  opencv.convert( GRAY );
  opencv.contrast( contrast_value );
  opencv.brightness( brightness_value );

  // proceed detection
  Rectangle[] faces = opencv.detect( 1.2, 2, OpenCV.HAAR_DO_CANNY_PRUNING, 40, 40 );

  // display the image
  image( opencv.image(), 0, 0 );

  // draw face area(s)
  noFill();
  stroke(255,0,0);
  for( int i=0; i<faces.length; i++ ) {
    rect( faces[i].x, faces[i].y, faces[i].width, faces[i].height );
  }

  //Find out if any faces were detected.
  if(faces.length > 0){
    //If a face was found, find the midpoint of the first face in the frame.
    //NOTE: The .x and .y of the face rectangle corresponds to the upper left corner of the rectangle,
    //      so we manipulate these values to find the midpoint of the rectangle.
    midFaceY = faces[0].y + (faces[0].height/2);
    midFaceX = faces[0].x + (faces[0].width/2);

    //Find out if the Y component of the face is below the middle of the screen.
    if(midFaceY < (midScreenY - midScreenWindow)){
      if(servoTiltPosition >= 5)servoTiltPosition -= stepSize; //If it is below the middle of the screen, update the tilt position variable to lower the tilt servo.
    }
    //Find out if the Y component of the face is above the middle of the screen.
    else if(midFaceY > (midScreenY + midScreenWindow)){
      if(servoTiltPosition <= 175)servoTiltPosition += stepSize; //Update the tilt position variable to raise the tilt servo.
    }
    //Find out if the X component of the face is to the left of the middle of the screen.
    if(midFaceX < (midScreenX - midScreenWindow)){
      if(servoPanPosition >= 5)servoPanPosition -= stepSize; //Update the pan position variable to move the servo to the left.
    }
    //Find out if the X component of the face is to the right of the middle of the screen.
    else if(midFaceX > (midScreenX + midScreenWindow)){
      if(servoPanPosition <= 175)servoPanPosition += stepSize; //Update the pan position variable to move the servo to the right.
    }
  }

  //Update the servo positions by sending the serial command to the Arduino.
  port.write(tiltChannel);       //Send the tilt servo ID
  port.write(servoTiltPosition); //Send the updated tilt position.
  port.write(panChannel);        //Send the Pan servo ID
  port.write(servoPanPosition);  //Send the updated pan position.
  delay(1);
}

/**
 * Changes contrast/brigthness values
 */
void mouseDragged() {
  contrast_value = (int) map( mouseX, 0, width, -128, 128 );
  brightness_value = (int) map( mouseY, 0, width, -128, 128 );
}
You're using two separate Processing wrappers for OpenCV (gab.* and hypermedia.*), both of which have an OpenCV class of their own. Use one or the other, but not both in the same project; Java can't tell which one you want to use (hence the "ambiguous OpenCV type" error).
You seem to be using the hypermedia classes anyway, so remove the gab.* import for now as a quick fix.
The gab.* library, though, is better (more up to date) than the hypermedia one, so you might want to update your OpenCV calls to use it in the future.

Distance between two Arduinos using RF links

I currently have a setup where the Uno sends a char over a 434 MHz transmitter (Tx) to a Mega with a receiver (Rx). The Mega counts how many times it receives the char, and if the count falls below a certain number it triggers an alarm. Is this a viable way to measure the distance between two microcontrollers indoors, or is there a better way?
Transmitter (Mega)
#include <SoftwareSerial.h>

int rxPin = 2; //Goes to the Receiver Pin
int txPin = 5; //Make sure it is set to pin 5 going to input of receiver

SoftwareSerial txSerial = SoftwareSerial(rxPin, txPin);
SoftwareSerial rxSerial = SoftwareSerial(txPin, rxPin);

char sendChar = 'H';

void setup() {
  pinMode(rxPin, INPUT);
  pinMode(txPin, OUTPUT);
  txSerial.begin(2400);
  rxSerial.begin(2400);
}

void loop() {
  txSerial.println(sendChar);
  Serial.print(sendChar);
}
Receiver
#include <SoftwareSerial.h>

//Make sure it is set to pin 5 going to the data input of the transmitter
int rxPin = 5;
int txPin = 3; //Don't need to make connections
int LED = 13;
int BUZZ = 9;
int t = 0;
char incomingChar = 0;
int counter = 0;

SoftwareSerial rxSerial = SoftwareSerial(rxPin, txPin);

void setup() {
  pinMode(rxPin, INPUT);  //initialize rxPin as input
  pinMode(BUZZ, OUTPUT);  //initialize buzzer for output
  pinMode(LED, OUTPUT);   //initialize LED for output
  rxSerial.begin(2400);   //set baud rate for transmission
  Serial.begin(2400);     //see above
}

void loop() {
  for (int i = 0; i < 200; i++) {
    incomingChar = rxSerial.read(); //read incoming msg from tx
    if (incomingChar == 'H') {
      counter++;                    //if we get the char 'H', count it
    }
    delay(5);                       //delay of 5 ms
  }
  Serial.println(incomingChar);
  Serial.println(counter);          //prints the number of chars received
  if (counter < 55) {
    //if we receive fewer than 55 chars, assume we are out of range and trigger the alarm
    Serial.println("out of range");
    tone(BUZZ, 5000, 500);
    digitalWrite(LED, HIGH);
  }
  else {
    //if we receive 55 or more chars we are within range, so turn off the alarm
    noTone(BUZZ);
    digitalWrite(LED, LOW);
    Serial.println("in range");
  }
  counter = 0;
  incomingChar = 0;
}
In theory you could measure distance by making the Uno send a message which the Mega echoes back. That would give the Uno a round-trip time for message propagation between the Arduinos. You would have to approximate the processing delays; after that it is basic physics. That is basically the same way radar works. The actual delay would be something like

t_roundtrip = t_uno_send + 2 * t_propagation + t_mega_receive + t_mega_send + t_uno_receive

I am guessing the distance you are trying to measure is on the order of meters. The required resolution is going to be an issue, because s = v*t => t = s/v, where s is the distance between your Arduinos and v = c in the case of radio waves. As the transmission and processing delays should stay constant, you basically have to be able to measure time differences on the order of s/c. I am not very familiar with Arduinos, so I do not know if they are capable of this kind of measurement.
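To put rough numbers on that (a back-of-the-envelope estimate, not from the original answer): for s = 3 m,

t_propagation = s / c = 3 m / (3*10^8 m/s) = 10 ns

while micros() on a 16 MHz Arduino only has a 4 us resolution, during which a radio wave travels roughly 1.2 km, so the propagation term is far below what the board can time directly.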
I would suggest you use an ultrasonic range finder like the Maxbotix HRLV-EZ4 sold by Sparkfun.
It is within your price range and it should be able to measure distances up to 5m/195 inches with 1mm resolution.
It is actually possible to do this; I have seen it done with other microcontrollers. With an Arduino you would have to solve the equations, port them to Arduino code, and make a lot of measurements to account for the discrepancies introduced by the communication itself. Do not forget about atmospheric attenuation, which needs to be known and included in the equations; humidity can also affect the propagation of the electromagnetic waves.

How to correctly calculate FPS in XNA?

I wrote a component to display current FPS.
The most important part of it is:
public override void Update(GameTime gameTime)
{
    elapseTime += (float)gameTime.ElapsedRealTime.TotalSeconds;
    frameCounter++;
    if (elapseTime > 1)
    {
        FPS = frameCounter;
        frameCounter = 0;
        elapseTime = 0;
    }
    base.Update(gameTime);
}

public override void Draw(GameTime gameTime)
{
    spriteBatch.Begin();
    spriteBatch.DrawString(font, "FPS " + ((int)FPS).ToString(), position, color, 0, origin, scale, SpriteEffects.None, 0);
    spriteBatch.End();
    base.Draw(gameTime);
}
In most cases it works ok, but recently I had a problem.
When I put the following code into the game's Update method, strange things start to happen.
if (threadPath == null || threadPath.ThreadState != ThreadState.Running)
{
    ThreadStart ts = new ThreadStart(current.PathFinder.FindPaths);
    threadPath = new Thread(ts);
    threadPath.Priority = ThreadPriority.Highest;
    threadPath.Start();
}
The main idea of this code is to keep the pathfinding algorithm running in a separate thread all the time.
By strange things I mean that sometimes the FPS drastically decreases, which is expected, but the displayed FPS changes more often than once a second. If I understand this code correctly, the FPS can't change more often than once a second.
Can someone explain to me what's going on?
Edit 26.03.2010: I've also posted the code of the Draw method.
Edit 31.03.2010: Answers to Venesectrix's questions
1) Are you running with a fixed or variable time step?
IsFixedTimeStep and SynchronizeWithVerticalRetrace are set to true.
2) Were you moving the XNA window around when this occurred?
No
3) Did the XNA window have focus?
Yes
4) How noticeable was it (i.e., updating so fast you can't read it, or just barely updating more than once a second)?
I was able to read the updates; the FPS was updating ~3 times a second.
5) And all of this only happens with the thread code in there?
Yes
Shawn Hargreaves has a great post about this here. The first difference I see between his code and yours is the fact that you reset your elapseTime to 0 each time, which will lose some time, whereas Shawn just subtracts 1 second from his elapsedTime. Also, Shawn uses ElapsedGameTime instead of ElapsedRealTime. He updates his frameCounter in the Draw function instead of the Update function as well.
As for why he uses ElapsedGameTime rather than ElapsedRealTime, he explains it in a comment after the post:
> Surely 1 / gameTime.ElapsedRealTime.TotalSeconds will therefore give the current framerate.

That will tell you how long it was since the previous call to Update, but that is not the same thing as your framerate!

a) If the game is dropping frames, Update will be called more frequently in order to catch up. You want to time the number of actual draws that are taking place, not just these extra catch-up logic frames.

b) The time for a single Update can fluctuate widely, so the figure you get out of that will be too flickery to be easily readable.
I would try his component, and see if it works for you. The post is pretty old, and I think you will have to change LoadGraphicsContent to LoadContent and UnloadGraphicsContent to UnloadContent, as another one of the comments points out.
Here is how I do it. With this method:
You average over n Frames
You can use it with any initialization method you choose
It should be easy to read and follow
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Audio;
using Microsoft.Xna.Framework.Content;
using Microsoft.Xna.Framework.GamerServices;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;
using Microsoft.Xna.Framework.Media;

namespace _60fps
{
    public class Game1 : Microsoft.Xna.Framework.Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;
        SpriteFont OutputFont;
        float Fps = 0f;
        private const int NumberSamples = 50; //Update fps timer based on this number of samples
        int[] Samples = new int[NumberSamples];
        int CurrentSample = 0;
        int TicksAggregate = 0;
        int SecondSinceStart = 0;

        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        protected override void Initialize()
        {
            base.Initialize();
            graphics.SynchronizeWithVerticalRetrace = false;
            int DesiredFrameRate = 60;
            TargetElapsedTime = new TimeSpan(TimeSpan.TicksPerSecond / DesiredFrameRate);
        }

        protected override void LoadContent()
        {
            spriteBatch = new SpriteBatch(GraphicsDevice);
            OutputFont = Content.Load<SpriteFont>("MessageFont");
        }

        protected override void UnloadContent()
        { /* Nothing to do */ }

        protected override void Update(GameTime gameTime)
        {
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == ButtonState.Pressed || Keyboard.GetState(PlayerIndex.One).IsKeyDown(Keys.Escape))
                this.Exit();
            base.Update(gameTime);
        }

        private float Sum(int[] Samples)
        {
            float RetVal = 0f;
            for (int i = 0; i < Samples.Length; i++)
            {
                RetVal += (float)Samples[i];
            }
            return RetVal;
        }

        private Color ClearColor = Color.FromNonPremultiplied(20, 20, 40, 255);

        protected override void Draw(GameTime gameTime)
        {
            Samples[CurrentSample++] = (int)gameTime.ElapsedGameTime.Ticks;
            TicksAggregate += (int)gameTime.ElapsedGameTime.Ticks;
            if (TicksAggregate > TimeSpan.TicksPerSecond)
            {
                TicksAggregate -= (int)TimeSpan.TicksPerSecond;
                SecondSinceStart += 1;
            }
            if (CurrentSample == NumberSamples) //We are past the end of the array since the array is 0-based and NumberSamples is 1-based
            {
                float AverageFrameTime = Sum(Samples) / NumberSamples;
                Fps = TimeSpan.TicksPerSecond / AverageFrameTime;
                CurrentSample = 0;
            }
            GraphicsDevice.Clear(ClearColor);
            spriteBatch.Begin();
            if (Fps > 0)
            {
                spriteBatch.DrawString(OutputFont, string.Format("Current FPS: {0}\r\nTime since startup: {1}", Fps.ToString("000"), TimeSpan.FromSeconds(SecondSinceStart).ToString()), new Vector2(10,10), Color.White);
            }
            spriteBatch.End();
            base.Draw(gameTime);
        }
    }
}
As for:
"but the question why displayed FPS was changing more often than once a second is still open"
The difference between ElapsedGameTime and ElapsedRealTime is that "ElapsedGameTime" is the amount of time since the last time you entered the Update or Draw statement (depending on which "gameTime" you're using - the one from Update or the one from Draw).
ElapsedRealTime is the time since the game started. Because of this, it increases linearly as the game continues to run. Indeed, after 1 second, you'll update every frame because your logic looked like this:
(Let's assume you were running 4 fps for the sake of easy explanation):
Frame 1: ElapsedRealTime: 0.25. Running total now: 0.25
Frame 2: ElapsedRealTime: 0.5 Running total now: 0.75
Frame 3: ElapsedRealTime: 0.75 Running total now: 1.5
Running total greater than 1!!! Show FPS!
Set Running total = 0
Frame 4: ElapsedRealTime: 1.00 Running total now: 1.0
Running total greater than 1!!! Show FPS!
Now that you've fixed the counter, you should only be getting ElapsedGameTime changes of a steady 0.25, so the progression now moves:
Frame 1: ElapsedGameTime: 0.25. Running total now: 0.25
Frame 2: ElapsedGameTime: 0.25 Running total now: 0.50
Frame 3: ElapsedGameTime: 0.25 Running total now: 0.75
Frame 4: ElapsedGameTime: 0.25 Running total now: 1.00
Running total greater than 1!!! Show FPS!
Set Running total = 0
Frame 5: ElapsedGameTime: 0.25. Running total now: 0.25
Frame 6: ElapsedGameTime: 0.25 Running total now: 0.50
Frame 7: ElapsedGameTime: 0.25 Running total now: 0.75
Frame 8: ElapsedGameTime: 0.25 Running total now: 1.00
Running total greater than 1!!! Show FPS!
Set Running total = 0
Which is what you're expecting. In short, now that you've corrected the first problem, you should have corrected the second too; the "why" is explained above.
As an aside ... you should avoid setting the thread priority. By assigning the highest thread priority to what should be a background thread, you could end up starving the main thread of CPU time, because the scheduler would give priority to threadPath.
Are you actively checking whether IsRunningSlowly changes? Even with IsFixedTimeStep set to true, if your program isn't able to do as many Updates as it expects, it will call Update more frequently.
A way I have mitigated this before is by directly calling ResetElapsedTime() instead of keeping track of the elapsed time yourself.
Not sure if that'll work for you, though. I did notice that when I was debugging the previous issue I had, it wouldn't call the extra Updates; probably a "feature" when debugging.
You should run your FPS counter in the Draw method.
Hi guys, if you want to show your real framerate you have to implement the framerate counter in the Draw method, because XNA works like this:
"If your computer can't keep up with the Update method, XNA suspends Draw and services Update instead."

Resources