
Networking is not my strong suit, and we just had a placement test where the question was "what is the maximum theoretical transfer rate on a Gigabit network?"

I have always thought it was 100 MB/s, but then someone asked me how I calculated that. Truth be told, I just know this number from reading: I equated 100 Mbit with 10 MB/s and took the logical step to Gigabit by adding a zero. Can someone please help me with the equation?

1,000 Mbit divided by what gives me the correct number?
Is it 8? Because that gives me 125 MB/s.

asked 12/16/2011 12:10


somewhereinafrica ♦♦


4 Answers:
Obviously, look at the conversions bit.

answered 2011-12-16 at 08:16:33


xmlmagician

It takes 8 bits to make one byte. Therefore, 1 billion bits (1 Gbit) divided by 8 equals 125 million bytes (125 MB). Actual throughput can vary due to many factors, including network drivers, the network equipment and its quality, packet size, geographic distance between the sending and receiving clients, file size, disk speed, disk fragmentation, and so on. As a result, it's usually a safe bet to divide by 10 instead and treat the difference as overhead from those "other factors".
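
To put numbers on it, here is a small sketch (Python assumed, since the thread shows no particular language) of the arithmetic described above: divide the bit rate by 8 for the theoretical byte rate, and by 10 for the rule-of-thumb estimate that allows for overhead.

    # Theoretical vs. rule-of-thumb throughput for a Gigabit link
    link_speed_bits_per_sec = 1_000_000_000           # 1 Gbit/s

    theoretical_bytes = link_speed_bits_per_sec / 8   # 125,000,000 B/s = 125 MB/s
    rule_of_thumb = link_speed_bits_per_sec / 10      # ~100 MB/s after overhead

    print(f"Theoretical maximum: {theoretical_bytes / 1e6:.0f} MB/s")
    print(f"Rule-of-thumb estimate: {rule_of_thumb / 1e6:.0f} MB/s")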

answered 2011-12-16 at 08:17:12


leew

@leew
Great, I just needed someone to tell me I'm not crazy, and what the magic divisor is.

answered 2011-12-16 at 09:05:26


somewhereinafrica

