Blur shader crash

So I'm trying to set up a sample-count-dynamic blur shader that takes externally generated Gaussian kernels and uses them to apply a vertical blur (the first pass of a two-pass system). But it crashes whenever it draws.

The idea is that it takes a pre-generated array of kernel weights, then, for a given diameter, iterates across the image, summing pixels multiplied by the appropriate kernel weight (the usual Gaussian blur shader approach).

The environment is OpenGL ES 2.0, I believe.

The Code (it does compile):

varying vec2 v_texcoord;
varying vec4 v_colour;

uniform vec2 resolution;
uniform int blur_diameter;  // number of taps; must be set with an *int* uniform
uniform float kernels[63];  // pre-normalised Gaussian weights in kernels[0..blur_diameter-1]

void main()
{ 
	float blurSize = 1.0/resolution.y; // height of one texel in texture coordinates
	vec4 sum = vec4(0.0);
	
	// start half the kernel above the current pixel, then step down one texel per tap
	// (note: strict GLSL ES 1.00, Appendix A, only guarantees loops with constant
	// bounds, so a non-constant blur_diameter relies on the driver being lenient)
	float yStart = v_texcoord.y - floor(float(blur_diameter)/2.0)*blurSize;
	for(int i = 0; i < blur_diameter; i++)
	{
		sum += texture2D(gm_BaseTexture, vec2(v_texcoord.x, yStart + float(i)*blurSize)) * kernels[i];
	}
	
	sum = smoothstep(0.0, 1.0, sum);
	gl_FragColor = v_colour * vec4(sum.rgb, 1.0);
}

For context, I've tested this a bunch; here is the data for one particular instance:
blur_diameter = 5
kernels = {0.192077, 0.203914, 0.208019, 0.203914, 0.192077}
resolution = vec2(966, 536)
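For reference, those weights look like a normalized Gaussian with sigma ≈ 5 (the sigma is inferred from the numbers, not necessarily the exact generator I used). A quick Python sketch of that kind of external kernel generation:

```python
import math

def gaussian_kernel(diameter, sigma):
    # Sample the Gaussian at integer offsets centred on zero,
    # then normalise so the weights sum to 1.
    radius = diameter // 2
    weights = [math.exp(-(x * x) / (2.0 * sigma * sigma))
               for x in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

print(gaussian_kernel(5, 5.0))
# symmetric, sums to 1, centre weight ~0.208 (matches the data above to ~3 decimals)
```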

Any help would be much appreciated. I'm using basic uniform arrays since it's just a 1D list of numbers that will never be longer than 63 entries, and I'm too lazy to set up sampler/buffer lookups. If this method just isn't compatible, or is too crap, let me know and I'll look into those.

Edit: I won't bump this since it's still on the front page and near the top, but some help would very much be appreciated.

This is solved, god damn it.

I was passing blur_diameter as a float instead of an int.
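For anyone who hits the same thing: in GameMaker the uniform call has to match the shader-side type exactly, so an `int` uniform goes through `shader_set_uniform_i`, not `shader_set_uniform_f`. Roughly like this (a sketch; the `sh_blur_v` shader name and handle variables are just placeholders from my setup):

```gml
// Look the uniform handles up once, e.g. in a create event.
u_resolution = shader_get_uniform(sh_blur_v, "resolution");
u_diameter   = shader_get_uniform(sh_blur_v, "blur_diameter");
u_kernels    = shader_get_uniform(sh_blur_v, "kernels");

// When drawing:
shader_set(sh_blur_v);
shader_set_uniform_f(u_resolution, 966, 536);
shader_set_uniform_i(u_diameter, 5);            // int uniform -> _i, not _f (this was the crash)
shader_set_uniform_f_array(u_kernels, kernels); // kernels is a 1D array of reals
// ... draw the surface/sprite ...
shader_reset();
```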

delete this, report me and end my life pls.