Excellent--this setup works perfectly! In the areas I was concentrating on, the speed increased by an order of magnitude.
However, the overall speed seems to have dropped. I believe this may be because the heavy indexing that follows on the result is slower in numpy. Is this a correct analysis?

What would be amazing is if I could port the other half of the algorithm to numpy too. The algorithm is for light volume extrusion of geometry. Pseudocode for the rest of the algorithm:

 1  v_array  # array of beautifully transformed vertices from last step
 2  n_array  # array of beautifully transformed normals from last step
 3  f_list   # list of vec4s, where every vec4 is three vertex indices and a
 4           # normal index.  These indices together describe a triangle--
 5           # three vertex indices into "v_array", and one normal from "n_array".
 6  edges = []
 7  for v1index, v2index, v3index, nindex in f_list:
 8      v1, v2, v3 = [v_array[i] for i in [v1index, v2index, v3index]]
 9      if angle_between(n_array[nindex], v1 - a_constant_point) < 90deg:
10          for edge in [[v1index, v2index], [v2index, v3index], [v3index, v1index]]:
11              # add "edge" to "edges"
12  # remove pairs of identical edges (also note that things like
13  # [831,326] are the same as [326,831])
14  edges2 = []
15  for edge in edges:
16      edges2.append([v_array[edge[0]], v_array[edge[1]]])
17  return edges2

If that needs clarification, let me know. The goal, obviously, is to remove as many looping operations as possible. I think the slowdown may be in lines 8, 9, and 16, as these are the lines that index into "v_array" or "n_array".

In line 9, the normal "n_array[nindex]" is tested against the vector from a vertex (in this case "v1") to a constant point (here, the light's position) to see whether the angle between them is less than 90deg--that is, whether the triangle's front faces the light. I thought this might be a particularly good candidate for a boolean array? (I've put a rough sketch of what I mean in a P.S. below.)

The edge-pair removal is something I have worked on fairly extensively. By testing and removing pairs as they are added (line 11), a bit more performance can be gained. I have written it as lines 12-13 here because I'm not sure that step can be done in numpy. My current algorithm uses Python's standard sets, with the edge as the key: if an edge is already in the set (in the same or reversed order), it is removed from the set; otherwise it is added. I may be able to do something with lines 14-17 myself--but I'm not sure.

If my code above can't be simplified using numpy, is there a way to efficiently convert numpy arrays back to standard Python lists?

As mentioned before, I'm certainly no pro with numpy--but I am learning! Thanks!

Ian
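P.S. To make what I'm imagining concrete, here is a rough, untested sketch of how I think the facing test and the edge handling might vectorize. I'm assuming the face list has already been converted to an (F, 4) integer array I'll call "f_array", that "light_pos" is the constant point, and that every interior edge is shared by exactly two triangles (so edges left with a count of one are the silhouette):

import numpy as np

def extrude_edges(v_array, n_array, f_array, light_pos):
    # Split the (F, 4) face array into its four index columns.
    v1i, v2i, v3i, ni = f_array.T

    # Line 9's test: angle_between(normal, v1 - light_pos) < 90deg is the
    # same as asking for a positive dot product, so no arccos is needed.
    to_vertex = v_array[v1i] - light_pos                        # (F, 3)
    facing = np.einsum('ij,ij->i', n_array[ni], to_vertex) > 0.0

    # Lines 10-11: the three edges of every facing triangle, as index pairs.
    tri = f_array[facing, :3]                                   # (K, 3)
    edges = np.concatenate([tri[:, [0, 1]],
                            tri[:, [1, 2]],
                            tri[:, [2, 0]]])                    # (3K, 2)

    # Lines 12-13: treat [831, 326] and [326, 831] as the same edge by
    # sorting each pair, then keep only edges that occur exactly once.
    edges_sorted = np.sort(edges, axis=1)
    unique_edges, counts = np.unique(edges_sorted, axis=0, return_counts=True)
    silhouette = unique_edges[counts == 1]                      # (E, 2)

    # Lines 14-17: gather the actual vertex positions for each edge.
    return v_array[silhouette]                                  # (E, 2, 3)

Whether "counts == 1" really does the same thing as my set-based pair removal probably depends on the mesh being closed and manifold, so I may be off there.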
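P.P.S. On the list-conversion question: I did come across ndarray.tolist() in the docs, which seems to turn a whole array into nested Python lists in one call. Is that the recommended way, or is there something faster? For example (using the hypothetical sketch above):

edges2 = extrude_edges(v_array, n_array, f_array, light_pos).tolist()
# edges2 is now a plain Python list of [[x, y, z], [x, y, z]] vertex pairs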