7 Replies - 1665 Views - Last Post: 22 November 2012 - 12:59 PM

#1 JL29 • D.I.C Head

[C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 21 November 2012 - 07:43 AM

So I'm writing an MD5 model loader in OpenGL at the moment, and I noticed something rather strange...

When I call glUseProgram(x) with more than one program, my memory usage slowly creeps up. When I use the same program throughout the run, memory stabilizes at 60,768 KB, but when I switch programs with glUseProgram(x) the memory keeps growing, albeit very slowly. After 10 minutes it sits at ~85 MB, 5 minutes later at 90 MB, and it keeps on growing...

Is it a bug in OpenGL or in my video drivers? Searching on Google I have found one, and only ONE, person with the same problem, and he thinks it is caused by his video card being an ATI card.

http://devgurus.amd.com/thread/145380


My video chip: ATI HD7470M


Does anyone have an explanation for this and/or a possible fix? What I've done for now is create a uniform variable and some functions, and I change the uniform to determine which shader path to use, but that gives me an ugly übershader...
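
Roughly, the übershader workaround looks like this (just a sketch with made-up names, not my actual loader code; it assumes a GL 2.0 context with an extension loader like GLEW already set up):

#include <GL/glew.h>

// One fragment shader with a uniform switch, so the app never changes
// programs with glUseProgram() in the middle of a frame.
static const char* uberFragSrc =
    "#version 120\n"
    "uniform int shadingMode;      // 0 = bump-mapped mesh, 1 = plain colour\n"
    "uniform sampler2D diffuseMap;\n"
    "void main() {\n"
    "    if (shadingMode == 0)\n"
    "        gl_FragColor = texture2D(diffuseMap, gl_TexCoord[0].st);\n"
    "    else\n"
    "        gl_FragColor = vec4(1.0);\n"
    "}\n";

// At render time only the uniform changes, not the program:
void renderFrame(GLuint uberProgram)
{
    GLint modeLoc = glGetUniformLocation(uberProgram, "shadingMode");

    glUseProgram(uberProgram);  // bound once
    glUniform1i(modeLoc, 0);    // mesh path
    // ... render meshes ...
    glUniform1i(modeLoc, 1);    // skeleton / text path
    // ... render skeleton and text ...
}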


Replies To: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

#2 anonymous26 • D.I.C Lover

Re: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 21 November 2012 - 08:33 AM

In the link you provided they call it a memory leak. I don't think that's what it is, because we don't know whether the memory is actually lost once the shader object is released. Anyway, moving on to your problem: there is nowhere near enough info here to point out what is going on, beyond the fact that the way the function is being used is likely triggering some kind of recursive behaviour that eats up memory with each call.

I would restructure your code to avoid repeated calls to the same code.
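
For example, something along these lines (just a sketch, with names guessed from your description):

#include <GL/glew.h>

// Only call glUseProgram() when the program actually changes, so redundant
// binds inside the render loop are skipped.
static GLuint g_currentProgram = 0;

void useProgramCached(GLuint program)
{
    if (program != g_currentProgram)
    {
        glUseProgram(program);
        g_currentProgram = program;
    }
}

// In the render loop:
//   useProgramCached(shaderprog0);  // meshes
//   useProgramCached(shaderprog1);  // skeleton / text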

#3 JL29 • D.I.C Head

Re: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 21 November 2012 - 10:13 AM

Thanks for the reply.

The problem could indeed be that function causing that sort of behaviour, although I don't see why it would. Here's an example of the structure I use that causes that "leak":
Mainloop
    Rendermodel
        glUseProgram(shaderprog0);
        rendermeshes
        if (renderskeleton)
            glUseProgram(shaderprog1);    // << Leaks here
            renderskeleton
    Rendertext
        if (usedprogram != shaderprog1)
            glUseProgram(shaderprog1);    // << Leaks here
            rendertext
EndMainLoop


When I create a uniform like I said in the first post and check that value in the shader to decide which path to use, there is no leak whatsoever... This thing is driving me nuts, because I don't like ugly shaders. :nono:



#4 anonymous26 • D.I.C Lover

Re: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 21 November 2012 - 10:31 AM

Looking at the docs:

http://www.opengl.or...lUseProgram.xml

This isn't how you use glUseProgram(). It appears it should be called once, with any further shaders attached via glAttachShader(). Also, the way you are checking for errors is incomplete: check the error values that can be returned, listed in the docs I've linked.
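
In other words, something like the setup sequence below, done once at startup (a rough sketch based on the docs, with placeholder names and minimal error checking):

#include <GL/glew.h>
#include <cstdio>

// One-time setup: create the shader objects, attach them to a program,
// link, and check the link status before using the program.
GLuint buildProgram(const char* vertSrc, const char* fragSrc)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vertSrc, NULL);
    glCompileShader(vs);

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fragSrc, NULL);
    glCompileShader(fs);

    GLuint program = glCreateProgram();
    glAttachShader(program, vs);   // attach happens once, at setup time
    glAttachShader(program, fs);
    glLinkProgram(program);

    GLint linked = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    if (linked != GL_TRUE)
        std::fprintf(stderr, "program link failed\n");

    // The shader objects can be flagged for deletion now; they live on
    // as long as the linked program still references them.
    glDeleteShader(vs);
    glDeleteShader(fs);
    return program;
}

// Per frame you then just bind whichever program you need:
//   glUseProgram(program);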

#5 JL29 • D.I.C Head

Re: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 21 November 2012 - 11:51 AM

Aha! Thank you! I'll see if it works tomorrow. :)

#6 anonymous26 • D.I.C Lover

Re: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 21 November 2012 - 12:01 PM

Awesome. Good luck! :)

#7 JL29 • D.I.C Head

Re: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 22 November 2012 - 01:10 AM

I found out that switching with glAttachShader is NOT how it should work. First of all it slows the code to a crawl; second, the appropriate way to switch is indeed glUseProgram. :) Trying to figure out why it eats memory made me notice something curious. I created two shader programs, BumpProgram and ColProgram, attached two shaders to each, and linked them.
Running the code WITHOUT drawing anything, the memory stabilized quickly at 30,122 KB, but when actually drawing something the memory suddenly started omnomnoming its way towards +inf RAM (if I let it continue). After 2 minutes the program used 450 MB of RAM. Trying to find out where the problem resides, I found the troublemaker, but I don't know how to fix it:

glColor3f( 1.0f, 1.0f, 1.0f );

glEnableVertexAttribArray( tangentLoc );
glEnableClientState( GL_VERTEX_ARRAY );
glEnableClientState( GL_TEXTURE_COORD_ARRAY );
glEnableClientState( GL_NORMAL_ARRAY );

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, mesh.m_TexID );
glActiveTexture( GL_TEXTURE1 );
glBindTexture( GL_TEXTURE_2D, mesh.m_NormID );
glActiveTexture( GL_TEXTURE2 );
glBindTexture( GL_TEXTURE_2D, mesh.m_SpecID );

glVertexPointer( 3, GL_FLOAT, 0, &(mesh.m_PositionBuffer[0]) );
glNormalPointer( GL_FLOAT, 0, &(mesh.m_NormalBuffer[0]) );
glTexCoordPointer( 2, GL_FLOAT, 0, &(mesh.m_Tex2DBuffer[0]) );

glVertexAttribPointer( tangentLoc, BumpProgram, GL_FLOAT, GL_FALSE, 0, &mesh.m_TangentBuffer[0] );

glDrawElements( GL_TRIANGLES, mesh.m_IndexBuffer.size(), GL_UNSIGNED_INT, &(mesh.m_IndexBuffer[0]) ); // THIS is the troublemaker

glDisableClientState( GL_NORMAL_ARRAY );
glDisableClientState( GL_TEXTURE_COORD_ARRAY );
glDisableClientState( GL_VERTEX_ARRAY );
glDisableVertexAttribArray( tangentLoc );


It's glDrawElements! But when I stop switching shader programs, it stops eating memory... How is that possible? When I comment that line out, the memory growth stops too. Does it have something to do with the way I pass the index buffer?
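
One thing I might try next is moving the vertex and index data into buffer objects, so glDrawElements reads from GPU-side buffers instead of client-side pointers. Roughly (sketch only, with made-up VBO members on a slimmed-down mesh struct):

#include <GL/glew.h>
#include <vector>

// Slimmed-down mesh just for this sketch; the real loader has more members.
struct MeshBuffers
{
    std::vector<float>        positions;  // xyz triples
    std::vector<unsigned int> indices;
    GLuint                    vertexVBO;
    GLuint                    indexVBO;
};

// One-time upload at load time, so the driver never has to copy the
// client-side arrays again on every draw call.
void uploadMesh(MeshBuffers& mesh)
{
    glGenBuffers(1, &mesh.vertexVBO);
    glBindBuffer(GL_ARRAY_BUFFER, mesh.vertexVBO);
    glBufferData(GL_ARRAY_BUFFER,
                 mesh.positions.size() * sizeof(float),
                 &mesh.positions[0], GL_STATIC_DRAW);

    glGenBuffers(1, &mesh.indexVBO);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, mesh.indexVBO);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 mesh.indices.size() * sizeof(unsigned int),
                 &mesh.indices[0], GL_STATIC_DRAW);
}

// At draw time the pointer arguments become byte offsets into the buffers:
void drawMesh(const MeshBuffers& mesh)
{
    glBindBuffer(GL_ARRAY_BUFFER, mesh.vertexVBO);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (void*)0);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, mesh.indexVBO);
    glDrawElements(GL_TRIANGLES, (GLsizei)mesh.indices.size(),
                   GL_UNSIGNED_INT, (void*)0);

    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
}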

#8 anonymous26 • D.I.C Lover

Re: [C++/OpenGL/GLSL] glUseProgram(x) eats up memory?

Posted 22 November 2012 - 12:59 PM

No idea. You need to find example code that works.
