I'm having a problem finding the correct angle so I can rotate an image to face a point (see the attached image for an example). In my program there is an arrow in the middle of the screen (900x900). I have all the code written to rotate the image correctly to a given angle (if I pass 90 to it, it'll rotate correctly to 90 degrees; if I pass 270, it'll rotate to that, etc.). The problem I'm having is that I can't seem to get the correct angle to pass to my function. I am trying to find the angle relative to the center of the screen (450,450) (red dots in the picture). My code is on my development computer so I can't copy & paste it, but I'll explain my process with some code. First I multiply the magnitudes of the two points together and divide the dot product of the points by that value.

// Example code (not actual code)
Point p1 = new Point(450, 450);
Point p2 = new Point(900, 900);
value = dotProduct(p1, p2) / (getMagnitude(p1) * getMagnitude(p2));

Then I use arccos to get the angle of the line in radians and convert it into degrees:

radian = Math.Acos(value);
angle = radian * (180 / Math.PI);

The angle I receive from the code above is 90, which isn't what I need, nor is it right. On a normal grid, shouldn't that be 45 degrees (p2 would be up and to the right of p1)? On the screen, wouldn't it be 135 (the top-left corner is 0,0 and the bottom-right is 900,900, so the line would angle down and to the right from the center)?
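Since I can't copy my real code, here is a compilable sketch of the process I described, typed up fresh in Java. The `dotProduct` and `getMagnitude` helpers are stand-ins for my actual helpers, written the way I remember them, so treat them as a guess at what my code is doing rather than the code itself:

```java
import java.awt.Point;

public class AngleDemo {

    // Stand-in for my dotProduct helper: dot product of the two position vectors
    static double dotProduct(Point a, Point b) {
        return a.x * b.x + a.y * b.y;
    }

    // Stand-in for my getMagnitude helper: length of the vector from (0,0) to p
    static double getMagnitude(Point p) {
        return Math.sqrt(p.x * p.x + p.y * p.y);
    }

    // The whole process in one place: cos(theta) from the dot product, then degrees
    public static double angleBetween(Point p1, Point p2) {
        double value = dotProduct(p1, p2) / (getMagnitude(p1) * getMagnitude(p2));
        // Clamp so floating-point rounding can't push Math.acos() into NaN
        value = Math.max(-1.0, Math.min(1.0, value));
        double radian = Math.acos(value);
        return radian * (180 / Math.PI);
    }

    public static void main(String[] args) {
        System.out.println(angleBetween(new Point(450, 450), new Point(900, 900)));
    }
}
```

`Point` here is `java.awt.Point`; otherwise the steps are exactly the ones I described above.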

I'm not really sure what the problem is. I can only assume I'm going about it the wrong way. Anybody have any ideas?

#### Attached image(s)

This post has been edited by **Donpa**: 04 October 2011 - 05:57 AM