My idea was to:
1) linearly interpolate between the current input sample and the preceding one (dividing the interval into N points)
2) "do stuff" for every interpolated value (evaluating a polynomial, in my case)
3) sum the results from all the interpolated points and divide by N (see the sketch below)
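If I've understood the steps right, it maps to something like this minimal C++ sketch (shape() and processSample() are made-up names, and the cubic is just a stand-in for whatever "do stuff" actually is):

```cpp
// Placeholder for the "do stuff" step: an example waveshaping polynomial.
float shape(float x)
{
    return x - (x * x * x) / 3.0f;
}

// Interpolate N points between the previous sample and the current one,
// shape each point, and return the average of the shaped values.
float processSample(float x, float& prev, int N)
{
    float sum = 0.0f;
    for (int i = 1; i <= N; ++i)
    {
        float t = static_cast<float>(i) / static_cast<float>(N); // fraction in (0, 1]
        float interp = prev + t * (x - prev);                    // linear interpolation
        sum += shape(interp);                                    // evaluate polynomial
    }
    prev = x;                              // remember current sample for the next call
    return sum / static_cast<float>(N);    // average over the N interpolated points
}
```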
This actually reduces aliasing a bit, but not by much, while increasing the DSP load considerably.
Any tips for cleverer techniques?
Edit: also, I sometimes fail to understand the discrete math notation (with all those Z[t-1] terms and such). Is there something fairly easy I could read to sort that out? Wikipedia is full of crazy DSP math, but it's like ancient Aramaic to me.