BBeck's Profile

Reputation: 710 Enlightened
Group:
Mentors
Active Posts:
1,670 (1.37 per day)
Joined:
24-April 12
Profile Views:
20,090
Last Active:
53 minutes ago
Currently:
Viewing Forum: Game Development

Previous Fields

Country:
US
OS Preference:
Windows
Favorite Browser:
Internet Explorer
Favorite Processor:
Intel
Favorite Gaming Platform:
PC
Your Car:
Who Cares
Dream Kudos:
100
Expert In:
Game Programming, XNA

BBeck is experimenting in his gaming lab. Don't be alarmed if something explodes.

Posts I've Made

  1. In Topic: xbox gamepad allegro 5

    Posted 26 Aug 2015

    I just watched a video the other night. I think it was Valve's Steam conference, where they were talking about Linux and Steam, and about how unimaginably difficult it is to support all the controllers out there.

    (It doesn't seem that hard to me, especially if they all go through a standard OS model, but what do I know: I don't code for anything other than the 360 controller.)

    Anyway, they said that Microsoft's answer to this is that "everything is an XBox 360 controller". Basically, DirectX only supports the XBox 360 controller, and supports it very natively (well, that and a few of their other controllers, like the Kinect). But they've also made the 360 controller and its drivers and such kind of proprietary. The only way I know to communicate with it is to use DirectX, which is kind of ugly if you are in OpenGL or something more Unix-based.

    But they were saying, I believe, that SDL has support for the 360 controller along with other controllers.

    I'll probably be getting into trying to figure out how to make that work if I can ever get my wireless Wacom graphics tablet (my mouse) working with Xubuntu. That seems to be the deal breaker for me going to Linux at this point. I spent the better part of last night trying to make it work and then went to bed without success. Wacom doesn't support Linux and lets the community support it. It's supposed to work out of the box but apparently doesn't.

    But as to what you were saying, it would not surprise me if Allegro doesn't recognize the 360 controller. The 360 controller seems to be very proprietary compared to other controllers. I've just been fortunate enough to be working in environments that support it natively like XNA and DX11. I even got Unity to run with it for the most part with a bit of work. Unity might be using SDL or something to make it work because it obviously is not using DX or it would work a lot more smoothly.
  2. In Topic: riemer's tut wrong?

    Posted 26 Aug 2015

    I might also mention that the code in my HLSL is pretty much what BasicEffect is doing. The key differences are that I have not covered fog yet and BasicEffect uses 3 directional lights instead of the one I use. (I can't imagine a single good reason to use more than one but it's not that hard to add more if you find a reason for it. Just define 3 diffuse light source vectors and repeat the code for the diffuse light source 3 times.) Other than that, my code is pretty much BasicEffect. The code I just posted above is basically BasicEffect.
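
    If you ever do want the extra lights, here is a rough sketch of that idea in HLSL. The numbered parameter names are made up for illustration; you would set them from your game code just like the single DiffuseLightDirection/DiffuseLightColor pair:

    //Minimal sketch: three directional diffuse lights instead of one.
    float3 DiffuseLightDirection1, DiffuseLightDirection2, DiffuseLightDirection3;
    float4 DiffuseLightColor1, DiffuseLightColor2, DiffuseLightColor3;
    
    float4 ThreeLightDiffuse(float3 PixelNormal)
    {
    	float4 DiffuseLight = float4(0.0f, 0.0f, 0.0f, 0.0f);
    
    	//Repeat the single-light diffuse calculation once per light and sum the results.
    	DiffuseLight += saturate(dot(PixelNormal, -DiffuseLightDirection1)) * DiffuseLightColor1;
    	DiffuseLight += saturate(dot(PixelNormal, -DiffuseLightDirection2)) * DiffuseLightColor2;
    	DiffuseLight += saturate(dot(PixelNormal, -DiffuseLightDirection3)) * DiffuseLightColor3;
    
    	return saturate(DiffuseLight);	//Clamp the sum so stacked lights don't blow out past 1.0.
    }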

    I think XNA has 5 built-in shaders (BasicEffect, SkinnedEffect, EnvironmentMapEffect, DualTextureEffect, and AlphaTestEffect); the others do some more specialized things, but BasicEffect is what you need 95% of the time.
  3. In Topic: riemer's tut wrong?

    Posted 26 Aug 2015

    The UV coordinates are part of the vertices. So, they're assigned by the modeling program, such as Blender.

    If BasicEffect displays this correctly, then you must either be altering the vertices in XNA or your shader is altering them.

    If you are using the model class and not defining your own vertex buffers, then it would likely be impossible for you to be altering the vertices.

    You might watch my texture shading video, although, like the other HLSL videos, it builds on all the videos that came before it. Still, other than the Blinn-Phong stuff, the texturing part is pretty straightforward. You can see the shader code I used there.

    Can you post your shader code? If you're using the model class, and BasicEffect draws it without problems, the problem is almost certainly in your shader.

    And yes, the custom Content Pipeline is one of the most difficult topics in XNA, in my opinion. To do all this stuff "properly" you may eventually need to get into it, but in many cases it's more trouble than it's worth. You certainly want to get things working first before reworking them to go through the Content Pipeline, because that's a whole lot of additional complication that will just confuse things. You want to learn it separately from everything else, where you can focus on just the Content Pipeline and how it works. I've spent some time on it and still find it to be pretty difficult.

    It should be noted that the UV map is not exactly a map. In your modeling program, you UV unwrap the model and produce a texture known as the UV map. All it really is is a flat photo that has the UV coordinates of the model assigned to positions in the photo, if that makes sense. So, at that point, every vertex in the model is assigned UV coordinates that coincide with a point on that photo. In the modeling program, you can move these points along the surface of the photo and it will automatically change them in the vertices of the model. But the real UV coordinates are still in the vertices of the model, not the UV map. The UV map is just a convenient way to visualize, on a 2D plane, the data stored in the 3D vertices of the model.

    Conveniently, this UV map usually has the vertex positions drawn on it, with lines between them to visualize the triangles. And again, you can move them around, but the real data is still stored in the vertices of the model. You can also save this photo, with the vertex points and the lines drawn between them, to use as a template to paint on in your paint program. But there is no real data stored in this UV map photo; it is just a visualization of the data in the vertices of the model. Because it aligns with that data, though, you can paint on the template, re-import your painted photo, and use it to UV wrap the model, and it will perfectly match the vertices of the model and texture it. Note that there is not really any connection between the photo you painted and the model itself, other than that you abided by the outline of the UV unwrap map. You can even use a completely different file for the import than the one you exported; but since you painted on the UV unwrap map, when you import your painting it will UV wrap perfectly onto the vertices of your model.

    That UV wrap map is your texture (aka color map). The UV unwrap map is generally thrown away unless you want to paint another texture for the model. It's also helpful for painting other types of maps such as specularity and such because it shows you how all the triangles of the model are laid out. But otherwise it's basically trash at that point.

    Anyway, there is no special passing of the UV coordinates. They have to be part of the vertices, and they will be when a model is imported into XNA. Especially since BasicEffect is drawing the texture correctly, the UV coordinates are definitely in the vertices where they belong.

    So, if all you are doing is texturing the model, it should basically be as straightforward as what is in my texturing video. If you have additional maps for normals, specularity, etc., then those textures have to be brought into the shader as well.

    But everything in those videos should fully cover using a texture with the XNA model class using a custom shader.

    Other than the fact that the model class wants to use BasicEffect rather than a custom shader, it should be relatively straightforward.

    You can literally pass anything in the tags if you organize it as an object, but the UV coordinates should never be anywhere other than part of the vertices. Since the XNA model class doesn't give you access to the vertices of the model, one question that comes to mind is "What is the vertex structure of your shader?" You don't have much flexibility on this since the XNA model class feeds this to you the way it wants it to be. In my code, I defined it as:

    struct VertexShaderInput
    {
        float4 Position : POSITION0;		//Position of the vertex.
    	float2 UV : TEXCOORD0;				//Texture coordinates of the vertex.
    	float3 Normal : NORMAL;				//Direction the vertex faces as a 3D normalized vector.
    	//float4 Color : COLOR;				//RGBA color of the vertex which is not necessarily used with textures. Apparently not supported in XNA's model class.
    };
    
    
    


    I'm not sure what the full structure actually is, but that worked. I think technically there's a color in there somewhere, but I never actually got it working. I don't know that this structure is published anywhere; I know this works, but it may not be complete. If your shader's vertex structure doesn't match what the model class is sending, there are going to be problems, and HLSL doesn't give really good errors, so it may not tell you what the problem actually is. But the 5th and 6th floats seem to be the UV coordinates.

    You might actually try using my shader and see if that works.

    //=========================================================================================================================================
    //
    //	TEXTURED SHADING
    //
    //
    //=========================================================================================================================================
    
    //Global Parameters
    float4x4 World;
    float4x4 View;
    float4x4 Projection;
    
    float4 AmbientLightColor;				//Color applied to everything.
    float3 DiffuseLightDirection;			//Direction of the directional light.
    float Padding;							//Needed for 16-byte alignment.
    float4 DiffuseLightColor;				//The color of the directional light.
    float4 CameraPosition;					//Where the camera is at.
    
    texture ColorMap;
    texture NormalMap;
    texture SpecularMap;
    
    sampler2D TextureSampler = sampler_state
    {
    	texture = (ColorMap);
    	magfilter = LINEAR;
    	minfilter = LINEAR;
    	AddressU = CLAMP;
    	AddressV = CLAMP;
    };
    
    sampler2D NormalsSampler = sampler_state
    {
    	texture = (NormalMap);
    	magfilter = LINEAR;
    	minfilter = LINEAR;
    	AddressU = CLAMP;
    	AddressV = CLAMP;
    };
    
    
    sampler2D SpecularSampler = sampler_state
    {
    	texture = (SpecularMap);
    	magfilter = LINEAR;
    	minfilter = LINEAR;
    	AddressU = CLAMP;
    	AddressV = CLAMP;
    };
    
    
    struct VertexShaderInput
    {
        float4 Position : POSITION0;		//Position of the vertex.
    	float2 UV : TEXCOORD0;				//Texture coordinates of the vertex.
    	float3 Normal : NORMAL;				//Direction the vertex faces as a 3D normalized vector.
    	//float4 Color : COLOR;				//RGBA color of the vertex which is not necessarily used with textures. Apparently not supported in XNA's model class.
    };
    
    
    struct VertexShaderOutput
    {
        float4 Position : POSITION0;
    	float3 WorldSpacePosition : TEXCOORD2;
    	//float4 Color : COLOR;
    	float3 Normal : NORMAL;
    	float2 UV : TEXCOORD0;
    };
    
    
    float4 BlinnSpecular(float3 LightDirection, float4 LightColor, float3 PixelNormal, float3 CameraDirection, float SpecularPower)
    {
    	float3 HalfwayNormal;
    	float4 SpecularLight;
    	float SpecularHighlightAmount;
    
    
    	HalfwayNormal = normalize(LightDirection + CameraDirection);
    	SpecularHighlightAmount = pow(saturate(dot(PixelNormal, HalfwayNormal)), SpecularPower);
    	SpecularLight = SpecularHighlightAmount * LightColor;
    
    	return SpecularLight;
    }
    
    
    VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
    {
        VertexShaderOutput output;
    
    	input.Position.w = 1.0f;	
        float4 worldPosition = mul(input.Position, World);
    	output.WorldSpacePosition = worldPosition.xyz;					//World position without camera conversion.
        float4 viewPosition = mul(worldPosition, View);
        output.Position = mul(viewPosition, Projection);
    
    	output.Normal = mul(input.Normal, (float3x3)World);				//Place the vertex's normal in the scene.
    	output.Normal = normalize(output.Normal);						//Normalize it if it managed to get unnormalized.
    
    	//output.Color = input.Color;										//Pass the color value of the vertex straight through.
    	output.UV = input.UV;											//Pass texture coordinates straight through.
    
        return output;
    }
    
    
    float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
    {
    	float3 LightDirection;
    	float DiffuseLightPercentage;
    	float4 OutputColor;
    	float4 SpecularColor;
    	float3 CameraDirection;	//Float3 because the w component really doesn't belong in a 3D vector normal.
    	float4 AmbientLight;
    	float4 DiffuseLight;
    	float4 Texel;
    
    
    	LightDirection = -DiffuseLightDirection;	//Normal must face into the light, rather than WITH the light to be lit up.
    	DiffuseLightPercentage = saturate(dot(input.Normal, LightDirection));	//Percentage is based on angle between the direction of light and the vertex's normal. 
    	//DiffuseLight = saturate((DiffuseLightColor * input.Color) * DiffuseLightPercentage);	//Apply only the percentage of the diffuse color. Saturate clamps output between 0.0 and 1.0.
    	DiffuseLight = saturate(DiffuseLightColor * DiffuseLightPercentage);	//Apply only the percentage of the diffuse color. Saturate clamps output between 0.0 and 1.0 or 0% to 100%.
    
    	CameraDirection = normalize(CameraPosition.xyz - input.WorldSpacePosition);	//Create a normal that points in the direction from the pixel to the camera.
    
    	if (DiffuseLightPercentage == 0.0f) 
    	{
    		SpecularColor  = float4(0.0f, 0.0f, 0.0f, 1.0f);
    	}
    	else
    	{
    		SpecularColor = BlinnSpecular(LightDirection, DiffuseLightColor, input.Normal, CameraDirection, 15.0f);
    	}
    
    	Texel = tex2D(TextureSampler, input.UV);
    	Texel.a = 1;
    
    	//OutputColor =  saturate((AmbientLightColor * input.Color) + DiffuseLight * DiffuseLightPercentage + SpecularColor);
    	OutputColor =  saturate((AmbientLightColor * Texel) + (DiffuseLight * Texel) + SpecularColor);
    
        return OutputColor;
    }
    
    
    technique Technique1
    {
        pass Pass1
        {
    		FillMode = Solid;	//Defaults to solid but you can make it WireFrame.
            VertexShader = compile vs_3_0 VertexShaderFunction();
            PixelShader = compile ps_3_0 PixelShaderFunction();
        }
    }
    
    
    


    It might help to strip out some of the unneeded stuff to avoid having to set up those parameters:

    //=========================================================================================================================================
    //
    //	TEXTURED SHADING
    //
    //
    //=========================================================================================================================================
    
    //Global Parameters
    float4x4 World;
    float4x4 View;
    float4x4 Projection;
    
    float4 AmbientLightColor;				//Color applied to everything.
    float3 DiffuseLightDirection;			//Direction of the directional light.
    float Padding;							//Needed for 16-byte alignment.
    float4 DiffuseLightColor;				//The color of the directional light.
    float4 CameraPosition;					//Where the camera is at.
    
    texture ColorMap;
    
    sampler2D TextureSampler = sampler_state
    {
    	texture = (ColorMap);
    	magfilter = LINEAR;
    	minfilter = LINEAR;
    	AddressU = CLAMP;
    	AddressV = CLAMP;
    };
    
    
    
    struct VertexShaderInput
    {
        float4 Position : POSITION0;		//Position of the vertex.
    	float2 UV : TEXCOORD0;				//Texture coordinates of the vertex.
    	float3 Normal : NORMAL;				//Direction the vertex faces as a 3D normalized vector.
    
    };
    
    
    struct VertexShaderOutput
    {
        float4 Position : POSITION0;
    	float3 WorldSpacePosition : TEXCOORD2;
    	float3 Normal : NORMAL;
    	float2 UV : TEXCOORD0;
    };
    
    
    float4 BlinnSpecular(float3 LightDirection, float4 LightColor, float3 PixelNormal, float3 CameraDirection, float SpecularPower)
    {
    	float3 HalfwayNormal;
    	float4 SpecularLight;
    	float SpecularHighlightAmount;
    
    
    	HalfwayNormal = normalize(LightDirection + CameraDirection);
    	SpecularHighlightAmount = pow(saturate(dot(PixelNormal, HalfwayNormal)), SpecularPower);
    	SpecularLight = SpecularHighlightAmount * LightColor;
    
    	return SpecularLight;
    }
    
    
    VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
    {
        VertexShaderOutput output;
    
    	input.Position.w = 1.0f;	
        float4 worldPosition = mul(input.Position, World);
    	output.WorldSpacePosition = worldPosition.xyz;					//World position without camera conversion.
        float4 viewPosition = mul(worldPosition, View);
        output.Position = mul(viewPosition, Projection);
    
    	output.Normal = mul(input.Normal, (float3x3)World);				//Place the vertex's normal in the scene.
    	output.Normal = normalize(output.Normal);						//Normalize it if it managed to get unnormalized.
    
    	//output.Color = input.Color;										//Pass the color value of the vertex straight through.
    	output.UV = input.UV;											//Pass texture coordinates straight through.
    
        return output;
    }
    
    
    float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
    {
    	float3 LightDirection;
    	float DiffuseLightPercentage;
    	float4 OutputColor;
    	float4 SpecularColor;
    	float3 CameraDirection;	//Float3 because the w component really doesn't belong in a 3D vector normal.
    	float4 AmbientLight;
    	float4 DiffuseLight;
    	float4 Texel;
    
    
    	LightDirection = -DiffuseLightDirection;	//Normal must face into the light, rather than WITH the light to be lit up.
    	DiffuseLightPercentage = saturate(dot(input.Normal, LightDirection));	//Percentage is based on angle between the direction of light and the vertex's normal. 
    	
    	DiffuseLight = saturate(DiffuseLightColor * DiffuseLightPercentage);	//Apply only the percentage of the diffuse color. Saturate clamps output between 0.0 and 1.0 or 0% to 100%.
    
    	CameraDirection = normalize(CameraPosition.xyz - input.WorldSpacePosition);	//Create a normal that points in the direction from the pixel to the camera.
    
    	if (DiffuseLightPercentage == 0.0f) 
    	{
    		SpecularColor  = float4(0.0f, 0.0f, 0.0f, 1.0f);
    	}
    	else
    	{
    		SpecularColor = BlinnSpecular(LightDirection, DiffuseLightColor, input.Normal, CameraDirection, 15.0f);
    	}
    
    	Texel = tex2D(TextureSampler, input.UV);
    	Texel.a = 1;
    
    	
    	OutputColor =  saturate((AmbientLightColor * Texel) + (DiffuseLight * Texel) + SpecularColor);
    
        return OutputColor;
    }
    
    
    technique Technique1
    {
        pass Pass1
        {
    		FillMode = Solid;	//Defaults to solid but you can make it WireFrame.
            VertexShader = compile vs_3_0 VertexShaderFunction();
            PixelShader = compile ps_3_0 PixelShaderFunction();
        }
    }
    
    
    


    It might be worthwhile to post your shader code.
  4. In Topic: XNA/Monogame 3D Collision not working pls help

    Posted 26 Aug 2015

    Spherical collision is the easiest form of collision detection. The test is merely whether the distance between the spheres' centers is less than the sum of their radii; if they are closer than that, they have to be overlapping.
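
    In XNA that test is a one-liner. Here is a minimal sketch, assuming you track a center and radius per object (the names here are made up for illustration):

    //Assumes using Microsoft.Xna.Framework;
    //Two spheres overlap exactly when the distance between their centers is less
    //than the sum of their radii. Comparing squared distances avoids a square root.
    bool SpheresCollide(Vector3 centerA, float radiusA, Vector3 centerB, float radiusB)
    {
        float radiusSum = radiusA + radiusB;
        return Vector3.DistanceSquared(centerA, centerB) < radiusSum * radiusSum;
    }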

    In your case, you seem to be trying to assign the collision spheres to the sub-meshes of the model, which is probably pretty reasonable.

    I mean, you can do a single sphere for the entire model, and that's probably the easiest, especially if your model does not have sub-meshes. And you don't have to use parts of the built-in model class for that.

    You can assign the sphere manually instead of trying to calculate it. Calculating it would likely involve finding the two vertices furthest from one another in the model, calculating the midpoint between them, and then imagining a sphere centered at that point with a radius of half the distance between them, as sketched below. Since I don't know of a way to get the vertices out of the XNA model class, that may be a bit difficult.
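
    Here is a rough sketch of that calculation for the case where you do have the positions from somewhere (say, your own vertex buffer). Note that XNA can also do this for you with BoundingSphere.CreateFromPoints, and strictly speaking a sphere built from the furthest pair can still miss a few outlying vertices, so treat this as the quick and dirty version:

    //Assumes using Microsoft.Xna.Framework;
    //Brute force: find the two positions furthest apart (O(n^2), so do this at
    //load time, not every frame), then center a sphere between them.
    BoundingSphere SphereFromPositions(Vector3[] positions)
    {
        Vector3 bestA = positions[0];
        Vector3 bestB = positions[0];
        float bestDistSq = 0.0f;
    
        for (int i = 0; i < positions.Length; i++)
        {
            for (int j = i + 1; j < positions.Length; j++)
            {
                float distSq = Vector3.DistanceSquared(positions[i], positions[j]);
                if (distSq > bestDistSq)
                {
                    bestDistSq = distSq;
                    bestA = positions[i];
                    bestB = positions[j];
                }
            }
        }
    
        return new BoundingSphere((bestA + bestB) * 0.5f, Vector3.Distance(bestA, bestB) * 0.5f);
    }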

    I would probably either assign the value manually, or use a sphere in Blender to model the collision sphere or something.

    Anyway, if you have collision spheres assigned to every sub-mesh, then you need to use the transforms to calculate the position of each sphere in the game world.

    That CopyAbsoluteBoneTransformsTo call, I believe, takes all the world matrices for each sub-mesh and calculates the final world matrix for each sub-mesh by combining it with those of all of its parents. So, it should give the world matrix used to draw that sub-mesh, including its position, orientation, and scale.
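
    In code, that pattern usually looks something like this (a sketch; model and world stand in for your Model and the world matrix you draw it with):

    //Assumes using Microsoft.Xna.Framework; and Microsoft.Xna.Framework.Graphics;
    Matrix[] transforms = new Matrix[model.Bones.Count];
    model.CopyAbsoluteBoneTransformsTo(transforms);
    
    foreach (ModelMesh mesh in model.Meshes)
    {
        //Combined matrix that places this sub-mesh in the game world.
        Matrix meshWorld = transforms[mesh.ParentBone.Index] * world;
    
        //ModelMesh already exposes a BoundingSphere in mesh space; move it into world space.
        BoundingSphere worldSphere = mesh.BoundingSphere.Transform(meshWorld);
    }

    Conveniently, ModelMesh.BoundingSphere carries its own Center, so transforming it this way also helps with the offset problem described next.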

    There's a bit of a problem there, as the collision sphere may not be at the position of the world matrix.

    You're presumably doing rigid animation here, as skinned animation is a lot more difficult and requires extensive modifications to XNA. So, the world matrices for each part may be centered. The wheels of a car, for example, would likely be centered on their world matrices, with the collision spheres centered there too.

    Consider a tank, though. The barrel of the gun likely has its world matrix centered at the end that connects it to the turret. A collision sphere is pretty much a bad choice here any way you slice it, although it may be a good choice for quick and dirty collision testing before using another, more expensive but more accurate, form of collision detection.

    But the collision sphere should probably be centered at the middle of the gun barrel, while its world matrix is at the near end of the barrel. So, to calculate the position of the collision sphere, you have to take the world matrix of the model (the tank body), multiply it with the matrix of the turret, multiply that with the matrix of the barrel, and possibly even apply a world matrix, or at least a position/offset, for the center of the collision sphere, which would only need position and maybe scale. That's basically what CopyAbsoluteBoneTransformsTo is doing, I believe, short of that final offset for the collision sphere. So, it might only go as far as giving you the final world matrix of that particular sub-mesh/bone, and you might have to use that to calculate the position of the sphere relative to it.

    If it were merely a 3D position/offset, you could make that position a Vector4 and multiply it by the sub-mesh's world matrix to get the proper position relative to the positioned model.
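
    That is essentially what Vector3.Transform does for you in XNA (it treats the Vector3 as having a w of 1). A one-line sketch, reusing the meshWorld matrix from the snippet above, where sphereLocalOffset is a made-up name for wherever you store the offset:

    //Move the sphere's center from sub-mesh space into world space (w treated as 1).
    Vector3 sphereWorldCenter = Vector3.Transform(sphereLocalOffset, meshWorld);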

    Have you tried just assigning a single collision sphere (a center position and a radius) to each whole model, rather than trying to do collision spheres for every sub-mesh? That would probably be 1,000 times easier and, depending on the model, might even give better results.
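
    If you go that route, you could even merge the sub-mesh spheres XNA already gives you into a single whole-model sphere at load time. A sketch, using the transforms array from the earlier snippet:

    //Merge every sub-mesh's sphere (moved into model space first) into one sphere.
    BoundingSphere wholeModel = model.Meshes[0].BoundingSphere.Transform(transforms[model.Meshes[0].ParentBone.Index]);
    foreach (ModelMesh mesh in model.Meshes)
    {
        BoundingSphere meshSphere = mesh.BoundingSphere.Transform(transforms[mesh.ParentBone.Index]);
        wholeModel = BoundingSphere.CreateMerged(wholeModel, meshSphere);
    }

    Then you only have one sphere per model to position and test.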
  5. In Topic: riemer's tut wrong?

    Posted 24 Aug 2015

    Did that help?

My Information

Member Title:
Here to help.
Age:
Age Unknown
Birthday:
Birthday Unknown
Gender:
Location:
Dallas, Texas, US of A, Planet Earth, Sol System, Milky Way Galaxy
Interests:
Rock music composition and performance, Grunge Music, Bebop (think Thelonious Monk), Swing (think Cab Calloway), Gaming, Astronomy, RPGs, Scuba, Sail Boats, Furniture Building, Cooking, Rocky Patel Cigars, Character Driven Dramas (like HBO's Deadwood), Story Telling (plot writing), Linguistics, Economics, Target Shooting, Electronics (don't know as much as I'd like to), all aspects of 3D game programming including music, physics, modeling, texturing, animation, probability, AI, lighting, etc., Texas Hold'em Poker, Learning, Dogs, the love of my life (my pit bull), guns (especially 19th century black powder), putting make-up on ogres, etc.
Programming Languages:
C, C++, C#, Visual Basic, Java, Pascal, T-SQL, HTML, FoxPro, ASP.Net(very little), Assembler, Machine Code(conceptually anyway)

Contact Information

E-mail:
Private
Website URL:
http://VirtuallyProgramming.com

Comments

  1. BBeck

    11 Aug 2013 - 04:27
    Generally, yes. :-)
  2. aaron1178

    10 Aug 2013 - 00:42
    You wouldn't happen to get high marks in written exams, would you? ;)