I assume the device bitrate setting (less quality to best quality) determines the bitrate at which video is sent from the server to the device. However, I've had trouble seeing any difference when playing with the values. I also assume this same setting determines the bitrate of a pre-transcoded file, but again, I haven't been able to see a difference. I have a Sony BDP-S590 that can display the bitrate while a video is playing, and I haven't seen it change after changing the device bitrate setting on the server.

Am I doing something wrong? Am I wrong in assuming that my player will show a different bitrate based on the setting on the server? Also, is there a way to configure it to pre-transcode only videos whose native bitrates are over a certain threshold?
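As a side note, one rough way I've been double-checking (independent of what the player reports) is estimating a file's overall bitrate from its size and duration. This is just a sketch of the arithmetic, not anything server-specific, and it gives the total container bitrate (video plus audio), so it will read a bit higher than the video stream alone:

```python
def approx_bitrate_kbps(size_bytes: float, duration_s: float) -> float:
    """Rough overall (container) bitrate: bits divided by seconds, in kbps."""
    return size_bytes * 8 / duration_s / 1000

# Example: a 1.5 GB file that runs 90 minutes
print(round(approx_bitrate_kbps(1.5 * 1024**3, 90 * 60)))  # ~2386 kbps
```

If the pre-transcoded file's size implies roughly the same bitrate as the original, that would suggest the setting isn't being applied.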

Thanks.