Open Access

I’ve recently been fielding queries about Open Access, and what it means for Community Broadband providers.

My article in Lightwave Magazine discusses the trade-offs and implementation of Open Access networks, and touches on their rationale and business models. Since I wrote it, Open Access has continued to appear in Community Broadband RFPs.

Most municipalities and co-ops would be well-advised to avoid being in the Internet Service Provider business. ISPs must provide a wide variety of customer service, operations, network administration, network engineering and network security functions. Doing so requires building a staff with the relevant skill sets, some of them on a 24×7 basis. Economies of scale matter. While larger municipal electric providers like those in Chattanooga, TN and Norwood, MA can support ISP operations, smaller community broadband projects often cannot.

Many consumers still insist on “triple-play” services (Internet + TV + voice). Pay television is no longer an attractive business, because of rising program costs and “cord cutting”. Some analysts claim that the current business model is collapsing. Small and emerging providers are distinctly disadvantaged in this space. Again, economies of scale matter, especially for leverage with content providers. So do depreciated capital assets. At present, there is little upside to the TV business for new community broadband networks.

Having decided not to be an ISP or triple-play provider, a community broadband network still has several options for getting these services to its customers. Most relevant to this discussion, it can contract operations out to a single third-party service provider, which in turn provides Internet and triple-play under the community’s brand. Alternatively, it can wholesale network capacity to multiple third-party service providers, each of which provides service under its own brand. The wholesale model is a defining characteristic of Open Access. Both approaches have been used. Of course, there are pros and cons to each, and trade-offs between them.

Well-established FTTx networking technologies provide solid support for separating the Service Provider role from the Network and Infrastructure Provider role. Again, there are numerous options. Recently, the trend toward Software Defined Networking (SDN), Network Functions Virtualization (NFV), network cloud, “white box” hardware and open-source software has created new ways of envisioning the Service Provider/Network Provider delineation. These architectural models promise to support a variety of potentially interesting business models.

I’d be happy to talk with you about all of the options and which one works best for your situation.  Contact me at info@netaccessfutures.net.

Mind your Money

My parents and grandparents lived through hard times. They taught me the virtue of thrift: “use it up, wear it out, make it do or do without”; “a penny saved is a penny earned”; “waste not, want not”; “money doesn’t grow on trees”; “take care of the pennies, and the dollars will take care of themselves”; “the bitterness of poor quality outlasts the sweetness of low price”. Professionally, that philosophy translates to the term “value engineering”.

So I have something I have to get off my chest. All too often, community broadband deployments cost more than they have to, for reasons that do not create commensurate value. Or, to put it bluntly, money is wasted.

Anybody who’s been following community broadband recognizes the difficulty of funding these vital projects. Municipalities, co-ops and small utilities seeking to build broadband infrastructure must cobble together a financing plan from various sources of money: federal and state “alphabet soup” programs, private grants, municipal bonds, bank loans, taxes, assessments, fees, retained earnings from utility operations, etc. Finding the money requires time and effort. Fund-raising sometimes falls short of project cost estimates, killing or delaying the project. Time to positive cash flow is scrutinized by potential lenders and bond ratings agencies. Funding a broadband project can affect other community priorities. Most important, project cost is ultimately paid for by the public — it’s their money and their trust.

For those reasons, every dollar of capital expense and future operating expense must be spent wisely. That goes beyond public ethics and fiscal responsibility (about which, unfortunately, the FCC recently needed to remind some folks). It also means having a clear-eyed understanding of what the project is supposed to accomplish, eliminating or postponing anything not critical to that mission, and optimizing for low life-cycle cost with high value.

Some activists see community broadband as a way to create competition, overthrow the incumbent monopolies, enable anyone to create killer applications, unlock hidden creativity, actuate democracy, revolutionize the economy, empower citizens, and save the world. Grandiose visions lead to grandiose projects. Pragmatically, though, those are not the problems that communities need to solve for themselves. Their problem is concrete and immediate: lack of adequate broadband service, at reasonable prices, with good customer support. This problem, with its familiar litany of symptoms – frustrated citizens, businesses driven away, kids doing their homework in the library parking lot at night, adults unable to telecommute, depressed property values, etc. – is self-evident. Solving it should be the laser-like focus of the business plan, network architecture and budget.

I keep hearing things at community broadband conferences, in blogs, and on webcasts that make me cringe, because I know they lead to overly expensive projects:

  • “As long as we’re putting in fiber, we might as well put in lots of it in fat cables. Who knows what we might want it for later?” Fiber-rich deployment can be appropriate in the backbone of the network: the middle mile, long haul and between data centers. It doesn’t scale to the extremities: residential and small business access networks. There, the cost of more fiber mounts quickly, for a lot of not-always-obvious reasons. I’ve written extensively on this subject for my clients. And by the way, a looming global supply shortage of optical fiber portends higher prices in the near future. Last mile networks should be designed fiber-lean, with reasonable margin for expected growth, plus a little extra.
  • “Every customer should have a dedicated point-to-point fiber, all the way to the switch….” In urban centers, “home-run” architectures are sometimes the lowest cost solution. They are also justified for large customers like office towers, data centers, hospitals, universities and wireless systems. However, for community broadband projects primarily serving residences and small businesses, shared fiber Passive Optical Networks (PONs) are usually the most cost-effective architecture. This is particularly true in rural and semi-rural areas, because of the distances involved. Home-run architectures can dramatically inflate project cost – sometimes by more than 100%!
  • “…so they get a first-class experience.” Dedicated fiber does not offer a perceptibly better customer experience than properly engineered shared fiber. The reasons are complex, as I explain in an article that I wrote for Broadband Communities. The bumper-sticker version is that a fiber’s bandwidth is either being used or wasted at any instant of time. Each customer actually uses only a tiny fraction of the available bandwidth. If that bandwidth is shared, more can be used and less is wasted. With proper engineering, there’s almost always more bandwidth available than is being used at a given time. This kind of resource sharing is how the Internet works, and is the main reason for its success. There’s no reason why the first mile should be any different.
  • “…because we’re building an open access network.” Some broadband network plans require “open access”. This means that infrastructure is owned by a Network Provider (NP), and shared by multiple Service Providers (SPs), which offer Internet and other services to retail customers. This recognizes that infrastructure tends to be what economists call a “natural monopoly”: capital cost, rights-of-way and market dilution present high barriers to new competitors. Open access brings consumers the benefits of competition for services, which are not natural monopolies. Open access can also simplify community broadband business plans by assigning ownership of lots of headaches to the SPs, which are organized to deal with them. The boundary between the NP and the SP can be drawn in several different ways. Some of them involve dedicating a point-to-point fiber from each customer to an NP-owned interconnection point, to which the customer’s chosen SP connects its own dedicated fiber. These approaches, in addition to suffering from the higher cost of dedicated fiber, present a number of operational problems. For example, when competition incents customers to switch SPs, change orders have to be performed by manually disconnecting and reconnecting fibers. Determining responsibility for problems can be tricky, and often ends up in a finger-pointing contest between SP and NP. Instead, I advise interconnection between NP and SP using “virtual open access”. Virtual Local Area Networks (VLANs) are a standard Ethernet feature integral to common FTTH technologies. The NP supplies VLANs to connect each customer to the SP of their choice. Change orders are performed entirely through software (a minimal sketch of this mapping appears after this list). Virtual open access has significantly lower capital and operating costs, and fewer headaches, than physical open access approaches. My recent article in Lightwave Magazine discusses this in detail. As a bonus, virtual open access makes it easy for a customer to have more than one SP (for example, one for family use and one for a home office).
  • “Symmetric bandwidth is an absolute requirement.” Residential and most small business customers consume much more traffic than they generate. This fact is well confirmed by measurements of live networks, and easily explained by application requirements, consumer behavior and typical small business needs. No foreseeable application is likely to change that. In fact, growing mass-market penetration of 4K/UHDTV video will increase the asymmetry. Asymmetric bit rates in some FTTH technologies have a valid engineering rationale, unrelated to sinister industry plots: cost optimization. The reasons have to do with the manufacturing costs of various kinds of lasers, and the way that light travels through fiber. It’s a long story, of interest only to fellow engineering geeks. Other FTTH technologies attain bit rate symmetry either at higher cost, or by unnecessarily restricting the downstream bit rate to match the most cost-effective upstream rate. Asymmetric bit rates do not preclude symmetric services. Many providers use a PON technology called GPON (2.5 gigabits per second (Gb/s) downstream, 1.25 Gb/s upstream) to offer 50/50, 100/100, 500/500 and even 1000/1000 megabit per second (Mb/s) services. They do this by taking advantage of bandwidth sharing and the asymmetry of user traffic. It works; the back-of-the-envelope sketch after this list shows why.
  • “Gigabit is so cliché. We’re going to offer 10 G.” If bragging rights and one-upping other communities are the primary objectives, then go for it. However, the equipment is significantly more expensive – by multiples of 1.5 to 3 for 10 Gb/s per fiber, and 5 to 8 for 40 Gb/s. Few customers can utilize 1 Gb/s service, never mind 10 Gb/s. As I explained in my Broadband Communities article, 10 Gb/s doesn’t even necessarily offer 10 times the performance of 1 Gb/s. I recommend marketing a 1 Gb/s flagship service to residential and small-to-medium business customers, while offering large customers 10 Gb/s service by special order.
  • “We can’t afford paid experts. Some of our volunteers have science and technology backgrounds. They’ll figure it out.” Community broadband efforts usually start with activists who volunteer their time and know-how. Occasionally, a volunteer might have worked in a planning, engineering or technology role at a telecom or cable company. Or there might be a volunteer with a background like mine, on the equipment vendor side of the telecom industry. Not many community broadband projects are fortunate enough to have those resources. Technical committees tend to consist of information technology (IT) professionals, software developers, engineers and scientists in unrelated disciplines, and technology enthusiasts. While these folks are presumably highly intelligent and knowledgeable in their own fields, they lack specialized knowledge of FTTH technologies and FTTH engineering economics. Without guidance from subject matter experts, they are likely to lose their way and fall into the traps I discuss above. Some technical committees spin their wheels in poorly informed arguments, and thus can’t agree on direction, make decisions, determine next steps, or generate actionable plans. Not bringing in appropriate expertise for these projects is “penny wise, pound foolish”.
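
To make “virtual open access” concrete, here is a minimal sketch of the per-customer VLAN mapping described above. It is illustrative only: the port names, SP names and VLAN IDs are hypothetical, and a real deployment would drive this from the NP’s provisioning system rather than a Python dictionary.

```python
# Minimal sketch of "virtual open access": each customer's port is
# mapped to the VLAN of their chosen Service Provider. All names and
# VLAN IDs here are hypothetical illustrations, not a vendor API.

# The Network Provider assigns one VLAN per Service Provider.
SP_VLANS = {
    "AcmeNet": 101,
    "BetaISP": 102,
}

# Customer port -> chosen SP. In a real OSS this lives in a database.
subscriptions = {
    "olt1/pon3/ont17": "AcmeNet",
    "olt1/pon3/ont18": "BetaISP",
}

def change_order(port: str, new_sp: str) -> int:
    """Switch a customer to a new SP: a software update, not a truck roll."""
    if new_sp not in SP_VLANS:
        raise ValueError(f"unknown service provider: {new_sp}")
    subscriptions[port] = new_sp
    return SP_VLANS[new_sp]  # VLAN to provision on the customer's port

# A customer switches providers; no fibers are touched.
vlan = change_order("olt1/pon3/ont17", "BetaISP")
print(f"Port olt1/pon3/ont17 now carries VLAN {vlan}")
```

The point of the sketch is the shape of the operation: switching providers touches a database and a port configuration, not a patch panel.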
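
And here is the back-of-the-envelope arithmetic behind bandwidth sharing on GPON, using the 2.5/1.25 Gb/s rates cited above. The 32-way split is a common choice; the busy-hour demand figures are illustrative assumptions, not measurements from any particular network.

```python
# Back-of-the-envelope sketch of why shared GPON fiber supports
# "symmetric" gigabit tiers. The busy-hour demand figures below are
# illustrative assumptions, not measurements.

GPON_DOWN_MBPS = 2500   # 2.5 Gb/s shared downstream
GPON_UP_MBPS = 1250     # 1.25 Gb/s shared upstream
SPLIT = 32              # customers sharing one PON (a common split ratio)

# Assumed average per-customer demand during the busy hour. Actual
# values vary by network; the point is that they are far below the
# advertised peak rate of 1000 Mb/s.
avg_down = 5.0          # Mb/s per customer, downstream
avg_up = 1.0            # Mb/s per customer, upstream (traffic is asymmetric)

load_down = SPLIT * avg_down
load_up = SPLIT * avg_up
print(f"Downstream: {load_down:.0f} of {GPON_DOWN_MBPS} Mb/s "
      f"({100 * load_down / GPON_DOWN_MBPS:.1f}% utilized)")
print(f"Upstream:   {load_up:.0f} of {GPON_UP_MBPS} Mb/s "
      f"({100 * load_up / GPON_UP_MBPS:.1f}% utilized)")
```

Even with every customer on a nominal 1000/1000 Mb/s tier, the average load stays far below the shared capacity in both directions. That headroom is what lets any individual customer burst to (nearly) the full advertised rate on demand.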

These themes recur all too often. I hate to think how much money has been wasted because of them. There’s too much misinformation and hyperbole out there, and too often the perfect becomes the enemy of the good.

If you’re organizing a community broadband network, I want to help you make sense of all of this, so you don’t waste your money. Contact me at: info@netaccessfutures.net.


It happened.

Unless you’ve been hiding under a rock, you know by now that the FCC voted – on party lines – to reclassify broadband access as a Title II telecommunications service (rather than as an unregulated information service) and re-apply the so-called Net Neutrality rules on that basis.  To telecom regulatory geeks like me, the webcast Open Meeting and dueling press conferences afterwards were a great show.

I doubt all this will have any substantive effect.  Here are my speculations.

There will, of course, be litigation, probably framed around the legal arguments in Commissioner O’Rielly’s dissent.  The composition of the DC Circuit has changed since the last round, and the FCC has apparently followed the less risky of the two road maps laid out by the Court in the Verizon decision.  I am not a lawyer, but as a keen observer, I’d expect the FCC to prevail in this one.

There will of course be attempts at legislation.  It is doubtful that any of them could get a cloture vote in the Senate in this term, and even more doubtful that they’d pass an override vote.  There will be some posturing in committee hearings over the next few weeks, but it won’t come to much.  In the worst case, there will be another standoff over the FCC’s budget.   A shutdown would stop the shot clock on two mergers, plus halt day-to-day licensing transactions and equipment approvals.  It can’t last too long.

I doubt that the Order is going to prevent the incumbents from doing anything they’d actually planned to do. That depends on the exact technical meaning of “paid prioritization”. If taken literally to include any network behavior other than Best Effort, it could throw a monkey wrench into managed VoIP, tele-presence and IPTV, as well as some IoT applications. I’ve been concerned that this might happen since the NPRM was released, and explained in comments why it would be a bad idea. Others have offered the same advice. We shall see what comes out in the Order.

I’d also be surprised to see significant changes to CAPEX.  Regardless of regulatory status, either there is a business case for new deployments and upgrades, or there isn’t.  Each of the big players has a story line, and I can’t imagine a business rationale  for any of them changing direction because of this Order.

The activists will spend the next couple of weeks enjoying their victory.  The real win was not the inconsequential “no blocking, no degrading, no paid prioritization” rule.  It was in applying the “just and reasonable” standard to broadband ISPs, and giving the FCC authority to adjudicate consumer complaints.  My guess: it would take a particularly egregious act for the Commission to take enforcement action based on consumer complaints.  The incumbents are usually too smart and too cautious to do such things.

Finally, I expect a proceeding on interconnection, either as a rulemaking or as an enforcement action in response to a complaint. This is the real issue that powered the astroturf campaign for Net Neutrality, and it is something of a wild card. My guess is that Netflix, Cogent and Level 3 are overreaching in their demand for settlement-free peering, with unlimited capacity upgrades at no cost to themselves. We shall see.

The devil is in the details. It may take a while for the final text of the Order to be complete. Despite the spin, this has been par for the course at the FCC for decades. Chairman Wheeler pointed out that a recent court decision compels the majority to provide a written response to the dissenting statements. This, of course, will not happen on “Internet Time”. I’m waiting patiently, with bated breath.

Network Neutrality Update

At CES 2015, Tom Wheeler spoke about the current thinking inside the FCC.

It seems that the “commercially reasonable” standard that the Commission would have to hang its hat on under Section 706 is ambiguous. A mere layman would read it as meaning “reasonable to all parties”: a plausible result of a hypothetical negotiation in which the parties had equal power. Apparently, the FCC’s attorneys were able to torture it into meaning “reasonable to the broadband ISP”: a plausible result of a hypothetical negotiation of lunch options between a wolf and a sheep. This is one reason why I am not a lawyer.

According to Wheeler, a more robust “just and reasonable” standard would require that broadband ISPs be subject to Sections 201 and 202 of the Communications Act, and adequate consumer protection would also require Section 208. And yes, these are sections under the dreaded Title II. The rest of Title II would be excessive. It was not clear to me that the FCC could forbear from all but those three sections, as a matter of law or politics. It now seems that the attorneys think they have enough legal legerdemain to do this.

As for my big concern about an over-broad interpretation of “paid priority” and “throttling”: the framework under discussion appears to give the FCC enough wiggle room to decide that, for example, a video transport service offered to all comers is “just and reasonable”. Wheeler also acknowledged that No Paid Prioritization should not be an absolute. So overall, not a bad outcome.

In the meantime, various bills are being hammered out in the House and the Senate to short-circuit the FCC proceedings. A law that gave the FCC authority to deal with blocking, throttling and paid prioritization would provide some certainty – that is, if we can trust this Congress to let the FCC do what it has to, and not to micromanage. I’m not taking bets.

The Commission will vote on a Report and Order at its February open meeting.

Hacked!

Over the holidays (naturally) some [expletives deleted] inserted malware into this site. It redirected viewers to an attack site, no doubt to nefarious ends. The vulnerability appears to have been in a WordPress 4.0 script. We’ve updated to WP 4.1 and taken other necessary steps.

Lessons learned:

  • Web sites aren’t “set and forget”.
  • Being a low-value target is not a defense.
  • Security updates don’t happen automagically.
  • Don’t trust the tools to set up protections appropriately.
  • Google and stopbadware.org are your friends… in the “tough love” sense of friends.
  • My first incident response involved a lot of trial-and-error. I can’t imagine how a site owner with no CS background could begin to deal with these kinds of problems. (One simple triage step is sketched below.)
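
For site owners who do have some command-line comfort, here is a minimal sketch of one triage step that tends to pay off: flagging recently modified PHP files, since injected malware usually shows up as unexpected fresh changes. The install path and look-back window are placeholder assumptions; treat this as a starting point for investigation, not a cleanup tool.

```python
# Flag PHP files modified recently anywhere under the site root.
# The path and window are hypothetical; adjust for your own install.
import os
import time

SITE_ROOT = "/var/www/html"   # placeholder WordPress install directory
WINDOW_DAYS = 14              # look back two weeks

cutoff = time.time() - WINDOW_DAYS * 86400
for dirpath, _dirnames, filenames in os.walk(SITE_ROOT):
    for name in filenames:
        if name.endswith(".php"):
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > cutoff:
                # Recently touched PHP deserves a manual look.
                print(time.ctime(os.path.getmtime(path)), path)
```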

Anyway, all clear.  Maybe someday law enforcement will catch up with these [expletives deleted].

Observations from the Broadband Communities Economic Development Conference

Last week, I attended a conference in Springfield, MA, sponsored by Broadband Communities Magazine, on broadband for economic development. This was the most cross-functional business event I’d attended in a very long time.  There were town planners, activists, civil servants, elected officials, academics, consultants, economists, not-for-profits, educators, healthcare experts, regulators, lobbyists, lawyers, trade groups, analysts, design/build firms, hardware, software and outside plant vendors, among many others.  So in addition to a big stack of business cards, here are a few things I brought back.

What brought all these people together was a problem:  business and residential broadband access is now a necessity,  yet many communities are un-served or under-served by incumbent telecom and cable providers.   Community leaders recognize that since the incumbents cannot make a business case for investment in their communities, the public sector has to step in.  The conversation revolved around “how?” rather than “why?”

Springfield would probably not come up on the typical conference organizer’s screen. Yet it is an ideal place to talk about community broadband. New England is a hive of activity. All of the New England states have broadband plans, with networks in various stages of planning. In particular, Springfield is close to WiredWest, a consortium of un-served western Massachusetts towns. A few western Massachusetts towns outside the consortium are planning their own broadband networks. A new initiative for broadband deployment in Connecticut was just announced; Springfield is a short drive from Hartford. Vermont is a straight shot up I-91, and New Hampshire not much further. And finally, Springfield is the hub of MassBroadband 123, the middle-mile network serving the western part of the state. So there’s a lot of good stuff happening within easy driving distance.

My world is centered on the technological  complexity of broadband networks.  So I learned a lot about the many political, regulatory, financial, Federal funding,  legal,  marketing, urban planning, educational and economic development complexities with which the technology must mesh.  I was particularly happy to get a deep dive into the financial analysis toolkit developed by our sponsor.  Attorneys from Baller-Herbst took their audience through a great survey of the legal landscape.

All of this activity is fueled by small piles of Federal cash, plus various State sources. In particular, the critical middle-mile infrastructure was built out mostly with Stimulus money, through the Broadband Technology Opportunities Program (BTOP). There’s FCC money for educational users under the E-Rate program, Universal Service Fund money, and Connect America Fund “Experiments” grants. There are USDA grants. There are HHS grants. State grants. Private philanthropy grants. And they are all painfully hard to get. Somehow, the would-be network operator has to cobble together enough of this “free” money to make a financial plan that will pass muster in the (troubled) municipal bond market, or at a bank. A lot of the folks I met are determined to do that, and came to the conference to figure out how.

I got to meet a lot of people.  Overall, this was an impressive group.  I knew some of them from my past life, some I was meeting in person after first meeting on the web, some I knew by reputation.  I enjoyed a number of deep discussions, on history, economic development, Net Neutrality, and more, as well as the topic at hand.

I did not get to meet very many of my technology peers. Many of these organizations are running on a shoestring, or haven’t gotten to the point of figuring out how to make their networks work. Some are drawing on volunteers – local IT professionals, civil and mechanical engineers, scientists, academics and enthusiasts – to inform the planning and procurement processes. I heard that they were having fierce internal debates about topics I’d thought had been settled. I heard about RFPs that cost more in change orders than the original award. I heard subtle misunderstandings that could lead to poor decisions. And most of the technical folks I talked with either recognize that broadband is outside their professional experience, or need a second opinion.

Plug:  This is the problem that NetAccess Futures is in business to solve.

As an aside, New Haven, CT is one of three Connecticut cities undertaking broadband infrastructure projects. It also happens to be the city I grew up in, and my family was deeply rooted in the New Haven community. It was a privilege to shake hands with Mayor Toni Harp.

I’d like to thank Broadband Communities Magazine for their organization and hospitality. This is not a typical trade rag. One pleasant surprise was the degree to which the magazine supports this diverse community. The editors are multi-talented, exceptionally intelligent, and passionate about broadband. In particular, Steve Ross gave me the better part of an evening for a very enjoyable conversation, and valuable business advice. For whatever it’s worth, I’ll have an article in their October edition.


Divide and Conquer

I’ve argued that the Network Neutrality battle is really about a number of different issues, some of them having little to do with transparency, blocking, and unreasonable discrimination.   Taken together, they are a hairball of  corporate tussles and consumer grievances.   Sometimes, the best way to solve a complex problem is to break it into simpler problems.  This is one of those cases.

FCC Chairman Tom Wheeler has started to do that. Last Friday, he announced an investigation into the set of issues surrounding interconnection, or “peering”. To consumers, recent periods of unacceptable video performance involving Netflix, its transit provider Cogent, and the broadband ISPs Comcast and Verizon look like unreasonable discrimination. They are actually caused by congestion at the points of interconnection between Cogent and the broadband ISPs, which in turn is caused by a commercial dispute over the terms of peering arrangements. This issue has overshadowed and confused the Network Neutrality (or, as the FCC calls it, “Preserving the Open Internet”) debate.

The Internet is literally just that: a network of interconnected networks.  It is an abstraction, realized as individual networks,  the physical connectivity between them, the data exchanged among them using the Border Gateway Protocol (BGP) and some administrative legerdemain.    While it is a truism that “nobody owns the Internet”,  some entity owns each of the networks that comprise the Internet.
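
To make that concrete, here is a toy sketch of the route exchange BGP performs. The AS numbers and prefixes are documentation values, not real networks, and real BGP carries far more attributes and policy than this.

```python
# Toy model of BGP route announcement. Each network (Autonomous System)
# tells its neighbors which prefixes it can reach; the sender prepends
# its own AS number, so the path the route has taken is visible to all.
# AS numbers and prefixes below are documentation values, not real ones.

def announce(prefix: str, as_path: list[int], sender_asn: int) -> dict:
    """Build the update one AS sends a neighbor for a single prefix."""
    return {"prefix": prefix, "as_path": [sender_asn] + as_path}

# A content network (AS 64500) originates its prefix...
route = announce("203.0.113.0/24", [], 64500)

# ...and an access network (AS 64501) that accepts it can pass it on,
# prepending itself. Receivers prefer shorter AS paths, all else equal.
route_onward = announce(route["prefix"], route["as_path"], 64501)

print(route)         # {'prefix': '203.0.113.0/24', 'as_path': [64500]}
print(route_onward)  # {'prefix': '203.0.113.0/24', 'as_path': [64501, 64500]}
```

Everything else about interconnection – who connects to whom, where, and on what commercial terms – rides on top of this simple mechanism.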

Interconnection is a dark corner of the Internet in need of sunlight.  To tell the truth, a lot of it was a complete mystery to me until fairly recently.  Brough Turner wrote up a great tutorial on the topic.   Please read it before continuing with this screed.

The current interconnection mess  is yet another example of the Internet’s growth pains.  Friendly handshake agreements that “I’ll take your traffic and you’ll take mine, and it will all even out”  functioned well in the environment in which they evolved.  This reciprocity worked  because the economic interests of the networks were symmetrical, with each benefiting equally from the interconnection, and each bearing similar costs.   To the extent that there was paid peering, overage charges or port congestion charges,  it wasn’t a serious problem that pricing and terms were individualized and confidential.   In a world where traffic flow has become highly asymmetrical, and the interests of networks which are content providers diverge from the interests of networks that are access providers, the old, informal system has become a point of contention.

To be clear, a press release and a news conference do not constitute a regulatory action by the FCC. The Commission must adhere to a process specified by law and its own procedures. So this is Step 0 of the process. What it means is that some of the beleaguered staff (presumably most of them from the Wireline Competition Bureau) are conducting an investigation into the way interconnection works. They have access to confidential contracts, such as the one between Netflix and Comcast, and get to hear all sides of the story. The information they gather will inform the formal regulatory process, if and when it goes forward.

This is a positive development.  If it plays out to its logical conclusion, everybody will have the rules in writing, and temptations to use interconnection as an anti-competitive weapon will be thwarted.   Stay tuned.

By the way, thanks to Jim Sackman for the shout-out.  Jim and I used to work for competing organizations, and either because of or despite that,  share a lot of perspective.  He tries to blog on Network Neutrality every Friday, and it’s well worth a read.


Democracy is Messy

I’m still digesting the Open Internet order, and intend to post my thoughts and comments later.   On the surface, it is fairly bland and inoffensive.  It tentatively suggests a way forward, but also seeks guidance.  And with the help of the Web, much guidance is forthcoming.

There are, unfortunately, many countries that actively suppress public participation in policy matters. Lashing out against oligarchs, cronies of the Leader, Party officials, well-connected businesses and the like can have dreadful consequences. Corrupt rent-seeking is accomplished through secretive discussions between businesses and officials. Citizens of these countries suffer tangibly, through economic and physical harm, and intangibly, through a sense of powerlessness and oppression.

We Americans are fortunate that the rights of free debate and petition are guaranteed by our Constitution, and that these rights are institutionalized in the fabric of our government. We are also fortunate that (with few exceptions) the machinery of government is bathed in sunlight, and that public access to policy debates is enforced by law – in this case, the Administrative Procedure Act of 1946. Each of us is entitled to our opinions on matters of public importance, to express them vigorously to our government, and to see that they are properly considered. Writing this on the 70th anniversary of the landings at Normandy, I cannot help but reflect upon the direct connection between the sacrifice and heroism of that day, and our rights, privileges and the Rule of Law.

That said,  there is a difference between an opinion and an informed opinion.  When I was a know-it-all kid, my parents and teachers rightly impressed upon me a duty to inform myself before spouting off.  Sometimes, it must have seemed like a losing battle.   More than once, I was admonished that “when you become an expert in such-and-such, you can debate with the experts;  in the meantime listen more and talk less”.    Or, in the words of Lady Burton:

 Men are four:
He who knows not and knows not he knows not, he is a fool—shun him;
He who knows not and knows he knows not, he is simple—teach him;
He who knows and knows not he knows, he is asleep—wake him;
He who knows and knows he knows, he is wise—follow him!
Lady Burton—Life of Sir Richard Burton.

Now that I am an expert in a field that has become contentious, I fully appreciate the wisdom of that teaching.

Which leads back to Network Neutrality.   Public opinion has become inflamed about a cluster of esoteric topics.  There are undisputed facts, disputed facts, speculations, conclusions and opinions.  Unfortunately, the policy debate has been overwhelmed by the latter.

“Net Neutrality” is not devoid of intellectual roots. It is based on reasoned analysis by legal experts, most notably Tim Wu, Susan Crawford and Larry Lessig. Their work deserves thoughtful debate on its merits. My take is that much of it is premised on a misunderstanding of selected, out-of-context facts, and developed through various logical fallacies. I believe that I can rebut most of their points.

The debate also has deep roots in the technology community. For many years, there has been a schism between those who see network resources (“bandwidth”) as abundant, and those who see them as scarce. In the view of the former, attempting to allocate presumably unlimited capacity is unnecessary, foolish and harmful. They see all traffic fitting into a common packet forwarding paradigm. The latter hold that at times, traffic demand at some points in the network will exceed capacity; that different traffic sources have different performance (“quality of service”) requirements; and that, as a result, congestion must be controlled by traffic management regimes specific to the needs of the source. I am firmly in the latter camp, as are many other respected experts; there are many respected experts on the other side. We have reasoned, if somewhat heated, debates on the topic, all based on a set of undisputed facts.

Pecuniary interest is a motivator on both sides.  It can be argued that at its root, this is a tussle between facility-based broadband providers and content providers over their respective shares of a slice of consumer discretionary income.   The latter have succeeded in portraying themselves as victims, and their opponents as greedy, highly profitable monopolists.  This despite the fact that the same charges can be leveled at some of them.   Some even attack the broadband providers for business practices at the access/transport level, while engaging in those same practices (and worse) at the content level.  Such behavior is expected, and sorting out these tussles is an important function of the FCC.

Then there is the common mob. Following the call to arms by the likes of Mr. Wu, Ms. Crawford and Mr. Lessig, as amplified through self-appointed public interest groups like Free Press, Public Knowledge and Save the Internet, and building on other frustrations with broadband operators, the topic has generated inarticulate rage – so much so that the FCC’s capability to process all the comments has been overwhelmed, as if by a denial-of-service attack. It is fitting that comedian John Oliver’s rant on Last Week Tonight (which apparently froze the FCC’s web site) was a call to action by the Internet’s population of trolls, fanboys, haters and flame writers.

Reading a selection of the more than 2600 comments in the FCC’s ECFS database is discouraging. Many are inarticulate rants, more like toddlers’ temper tantrums than contributions to a policy debate. Many are obviously copied. Many are off-topic, devoid of basis in fact, built on fallacy and misunderstanding. Many give cause to mourn the state of public education in English grammar and composition. And most consist of only a single sentence fragment.

I imagine a small army of staffers in the FCC’s headquarters, trying desperately to keep up with the flood.   Their task is unenviable.  I rather fear that thoughtful, original comments – on both sides of the issue – will be lost in the shouting.


Broadband Access Strategy and Technology