Hello again. I have been running a program of mine for several months now and it appears to be working the way I want it to; it calculates all of its entry and exit points from the close of each candle.

I do have a question, though. I have been backtesting the program with a default share size of 10,000 and $50 of slippage per trade (half a cent per share), to account for the fact that the candle will close on the bid roughly 50% of the time and on the ask the other 50%. What I am wondering is whether that amount of slippage is accurate. How much slippage should I be using to realistically represent live trading? (My program runs as a black box, so it would execute instantaneously, and my trading platform sends both its entry and exit orders as market orders.)
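In case it helps clarify what I mean, here is a minimal sketch of the slippage model I am currently using; the names (fill_price, slippage_cost, half_spread) are just placeholders I made up for this post, and the half-cent figure is exactly the assumption I am asking about, not something I am claiming is right:

# Sketch of the fixed half-spread slippage model described above.
# The 0.005 default is the half-cent-per-share assumption in question.

def fill_price(close: float, side: str, half_spread: float = 0.005) -> float:
    """Assumed fill for a market order triggered at the candle close.

    Buys are assumed to lift the ask (close + half_spread) and sells to
    hit the bid (close - half_spread), so on average each execution pays
    half the bid/ask spread.
    """
    if side == "buy":
        return close + half_spread
    if side == "sell":
        return close - half_spread
    raise ValueError(f"unknown side: {side}")

def slippage_cost(shares: int, half_spread: float = 0.005) -> float:
    """Dollar slippage charged per execution: shares * half_spread."""
    return shares * half_spread

# With the numbers from my backtest: 10,000 shares * $0.005 = $50 per trade.
print(slippage_cost(10_000))       # 50.0
print(fill_price(25.00, "buy"))    # 25.005
print(fill_price(25.00, "sell"))   # 24.995

So every entry and every exit gets charged a flat half cent per share against the close. My question is whether that flat number is a realistic stand-in for market-order fills, or whether real slippage behaves differently enough that I should be modeling it another way.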