I have my dad’s computer set up on my network so that we can share internet and my nephews can play some games over the LAN when they come over. He lives about 250 feet from my house (my house is almost directly behind his). I have an 8 dBi antenna attached to my router, sitting in the window of my computer room, and at his house there is a Linksys 54g game adapter in his window. That adapter connects to an IOGear powerline network adapter plugged into an electrical outlet, and a second powerline adapter in the basement connects to his computer’s Ethernet card. I have to do it this way to get the signal upstairs without having to run Ethernet cable up there and drill into walls or floors, which he is not willing to have done.

One of the things I hoped to accomplish a couple of years ago when I set this up was for him or my nephews to be able to watch DVD ISOs stored on my computer, and now that I record a lot of material it would be nice if they could watch that too. Once everything was set up, though, playback was stop-and-start and erratic, and I realized the DVD ISOs weren’t going to play because of their high bit rate. A DivX or Xvid file, on the other hand, plays fine.

Now that you know the setup and what I want out of it, here’s my question: which is the bigger issue, distance or signal strength? The signal strength I get from his house is anywhere from 45% to 65%. For example, if I got a 50% signal, would I be running at about 27 Mb/s? I know that 54 Mb/s is only a “theoretical” speed, not the actual throughput. I have also seen charts relating distance to speed, for example 54 Mb/s at 1 foot and 1 Mb/s at 300 feet. So if I could get my signal strength to 100%, would I still be limited to a certain speed because of the distance, or would a 100% signal at that distance mean I’d run at full speed? I’m not sure I’ve made this clear; if not, let me know and I’ll try to clarify what I’m asking.
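
To put rough numbers on what I’m asking, here’s the back-of-envelope check I’ve been trying to do, written out as a quick Python sketch. The efficiency factor and the bitrate figures are assumptions on my part, not measurements from my network:

    # Rough check: does a given 802.11g link rate leave enough real throughput
    # for DVD ISO playback? All figures below are assumptions, not measurements.

    G_RATES_MBPS = [54, 48, 36, 24, 18, 12, 9, 6]   # rates an 802.11g link steps between

    def usable_throughput(link_rate_mbps, efficiency=0.5):
        # Assume real-world throughput is roughly half the negotiated link rate
        # once protocol overhead and retransmissions are counted.
        return link_rate_mbps * efficiency

    DVD_ISO_PEAK_MBPS = 10.0   # rough peak bitrate of DVD-Video
    DIVX_TYPICAL_MBPS = 1.5    # rough bitrate of a typical DivX/Xvid file

    for rate in G_RATES_MBPS:
        usable = usable_throughput(rate)
        dvd_ok = "yes" if usable > DVD_ISO_PEAK_MBPS else "no"
        divx_ok = "yes" if usable > DIVX_TYPICAL_MBPS else "no"
        print(f"link {rate:>2} Mb/s -> ~{usable:4.1f} Mb/s usable | DVD ISO: {dvd_ok} | DivX: {divx_ok}")

If that halving assumption is anywhere near right, a DivX file should work at almost any link rate while a DVD ISO needs one of the higher rates, which seems to match what I’m seeing. What I don’t know is whether signal strength or distance is what decides which of those rates I actually get.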