This is a video from a camera on one of the Space Shuttle’s Solid Rocket Boosters. The cool part is that it has sound!! I know I’m never going to fly into space, so I live vicariously through videos like this. If you’re impatient, the video gets good at 1:50 when the separation happens.
The timer in the upper left is launch time (T+). Notice how the other SRB doesn’t stray too far from the camera and can be seen against the Earth. Plus you can see the smoke trailing from the falling boosters and, in the far distance, the smoke column caused by the initial shuttle launch.
The cool part about having sound is that you can hear the change in noise as the air thins, plus the rattle of debris impacting the booster casing, the chutes’ deployment and inflation, and finally the impact with the water.
The SRBs are basically giant bottle rockets; once they’re lit, the only ways to shut them off are to let them burn out or to trigger the self-destruct (which was only used once, after the Challenger accident).
The SRBs only burn for about 2 minutes, then jettison from the shuttle at approximately 27 miles up. Their momentum is so great that they coast up to about 41 miles at the peak of their arc before falling back to Earth.
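That coast is consistent with simple ballistics. Here’s a quick back-of-envelope in Python (my own sanity check using the numbers above, ignoring drag; these aren’t NASA’s figures):

```python
import math

# Rough plausibility check only: a pure ballistic coast, ignoring drag.
g = 9.81                           # m/s^2
coast_miles = 41 - 27              # from jettison altitude to the peak of the arc
coast_m = coast_miles * 1609.34

# Vertical speed at separation implied by coasting up that far: v = sqrt(2*g*h)
v_vertical = math.sqrt(2 * g * coast_m)
print(f"~{v_vertical:.0f} m/s (~{v_vertical * 2.237:.0f} mph) of vertical speed at separation")
```

That works out to roughly 660 m/s of vertical speed, nearly Mach 2 straight up at the moment the boosters let go.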
A small drogue chute orients the SRBs upright, and about a mile up the three main chutes open so the 91-ton empty cylinder won’t be damaged on impact. I never knew until now that the chutes are held partially closed (or “reefed”) until the booster slows to a set speed and they can be fully opened; otherwise the sudden full inflation could shred the chutes or snap the cables holding them.
The empty booster is only open at the bottom, so landing tail-first seals air inside the rocket cylinder, causing it to float upright, sticking about 30 feet out of the water.
Roughly 6 minutes after launch and 130 miles off the coast of Florida, the boosters end their short trip to the edge of space. In the past they would be recovered and reused 4 or 5 more times, but with the close of the shuttle program they’re just collected for scrap now.
Like I said yesterday, Wired’s article is already making waves. Chris Anderson was interviewed on NPR about it this morning, and this afternoon it made the news crawl on CNN.
One thing I like from the NPR interview is that Chris mentioned that by “dead” he means the Web transitioning to mobile, which in a way is very true. Although he still talks about how applications rule and how they will kill the web.
Here’s an experiment to see if he’s right: Use only apps, no web browser.
Go 2 days without ever opening Firefox, IE, Safari, Chrome, etc.
Don’t use Google (it’s a WEB page).
Try getting the things you want done with only dedicated apps. No diversity of millions of web pages, just the 20 or so apps you can load before your phone fills up.
Don’t be fooled by apps that redirect you to a browser; they’re cheating.
Basically Chris’s prediction for the future of the web is one where the multiverse of web pages is boiled down to a handful of corporate apps that port and filter the web for you, much like the archaic AOL days of internet prehistory. And that scares the shit out of me.
Luckily he’s wrong!
Rob Beschizza edited the fact-distorting graph Chris used for the Wired article to better fit reality. Pay close attention to the red “web” traffic that is supposedly “dying.” It’s the same graph, adjusted using the same data behind Wired’s article, to reflect the actual amount of traffic passed in each category.
Wired’s article shows web use as a percentage measured against other high-bandwidth internet traffic. Once we see the actual amount of web traffic, it’s clear that in the last 5 years the web has almost tripled. Rob summed up Cisco’s data best:
Assuming that this crudely renormalized graph is at all accurate, it doesn’t even seem to be the case that the web’s ongoing growth has slowed. It’s rather been joined by even more explosive growth in file-sharing and video, which is often embedded in the web in any case.
And in regards to using “bandwidth” to measure the value of internet traffic:
Does 50MB of YouTube kitteh represent more meaningful growth than a 5MB Wired feature?
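To see what that renormalization actually does, here’s a minimal sketch in Python. The shares and totals below are made-up placeholders to illustrate the math, not Cisco’s actual figures:

```python
# Hypothetical numbers for illustration only; not Cisco's real data.
total_traffic_pb = {1995: 0.1, 2000: 5.0, 2005: 100.0, 2010: 2000.0}  # PB/month, assumed
web_share        = {1995: 0.20, 2000: 0.50, 2005: 0.35, 2010: 0.23}   # web's slice, assumed

for year in sorted(total_traffic_pb):
    web_pb = total_traffic_pb[year] * web_share[year]
    print(f"{year}: web = {web_share[year]:.0%} of all traffic, "
          f"{web_pb:,.1f} PB/month in absolute terms")

# The share shrinks while the absolute web traffic explodes: same data, different story.
```

Plot the percentage and the web looks like it’s dying; plot the petabytes and it’s growing like a weed.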
It’s worth noting that we’re talking about generalized numbers and graphs, and that there will be a bit of variation in the data. But the web is still a LONG way from dying. Harry McCracken at Technologizer has another great article pointing out other technologies that have “died” recently (hint: Facebook died 2 years ago, but vinyl is alive and well).
This is one of the reasons I quit subscribing to Wired. Idiotic, sensationalizing articles.
Now, I fully appreciate the irony that I’m complaining about Wired sensationalizing articles to draw viewers, and that by posting this I’m part of the problem, taking the bait hook, line, and sinker. But this article is going to be splayed across the internet and the news simply because of the source, and it needs to be killed now.
It’s the same tired argument that has been around since the iPhone and has only sped up since the iPad: “apps” and online video streaming are going to take over the internet, and surfing web pages as we know it will cease to exist. Basically Chris is channeling a Steve Jobs presentation (or even plagiarizing one).
As much as we love the open, unfettered Web, we’re abandoning it for simpler, sleeker services that just work. -Chris Anderson
At least he didn’t call the services “magical”.
The graphic showing a shrinking web is hard to ignore, and I heard that 95% of online stats aren’t made up or distorted.
The reasons to scoff at editor-in-chief Chris Anderson’s claims?
1. The diagram runs from 1995 (i.e. 7 years before most people used the internet) to 2005 (i.e. half a decade ago, 2 years before Jobs’ iPhone app revolution).
In Chris’s defense, 2005 was before the magical apps and services he describes even existed, so they wouldn’t show up yet.
2. “Web” is used here as a general catch-all that covers a lot of very different and dynamic services.
3. Anybody with an office job knows that email rules the world. Yet even including spam, it’s practically non-existent on this graph, which points to how poorly the graph reflects the reality of the internet.
4. Apps and services are just a frontend to parse web data. The web is still there; you’re just using a very specialized browser to access it. The Facebook app is nothing without Facebook itself.
5. The MAIN problem with the graph is that it measures bits of traffic, which is not representative of the web experience.
Text on the internet is the smallest part of it. This entire article takes up the same space as a 1”x1” image. On a boring static webpage the images take up 90% of the space. To put this in perspective, in 2006 Wikipedia (the entire thing) was 1.2 terabytes in size; the whole thing could fit on one large hard drive (can you say real-life HHG2G?).
Videos on the internet take up MUCH more space than anything else, especially if you’re watching an HQ YouTube or Hulu stream. 10 minutes of HQ YouTube will pass as much traffic as all the surfing you’ll do on Wikipedia for the next few months.
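A quick bit of arithmetic shows just how lopsided that comparison is (the stream bitrate and page size here are my own rough assumptions):

```python
# Assumed sizes: ~2 Mbps for an HQ stream, ~100 KB for a text-heavy page.
stream_mbps = 2.0
minutes = 10
video_mb = stream_mbps * minutes * 60 / 8     # megabits to megabytes: 150 MB

page_kb = 100
pages = video_mb * 1024 / page_kb             # roughly 1,500 page loads
print(f"{minutes} min of video = ~{video_mb:.0f} MB = ~{pages:.0f} Wikipedia-sized pages")
```

At a few dozen articles a day, those ~1,500 page loads really are a month or two of reading.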
Suddenly the above graph makes much more sense. Even if online video made up 90% of traffic by volume, most time online could still be spent just surfing the web. And this is why it’s shocking that the editor of Wired Magazine wrote this article: it horribly misrepresents the data Cisco provided about web traffic. Much more useful would be how much time people spend on different websites; however, that’s much harder to measure.
I could point to their failed and horribly thought-out Garmin phone as an example, but this is something much more basic that affects all their new products.
About a year and a half ago I got a Garmin Nuvi 250 when the price dropped to $100. You may have noticed on the road that A LOT of people have GPS in their cars now. This recent price drop is why.
Anyway, about a week ago it told me to update the maps. Makes sense; there are a lot of places I drive with new roads that aren’t on its maps. I hit cancel and forgot about it. Then today it nagged me again to download maps, so I went online and started the process of updating the GPS.
First off, plugging the GPS into USB killed my keyboard. I don’t know why. I had to plug the keyboard into a different port to get it back; at least it didn’t get fried like an external hard drive of mine did a few years ago.
Then, to get the GPS to update, you have to go to Garmin’s website and download a browser plugin that detects the GPS. This involves a lengthy registration process I didn’t want to do. The last thing I want is to give my email address and physical address to YET ANOTHER company to spam me.
Now I had the plugin running and the GPS plugged in. But it wouldn’t detect the GPS.
Move the GPS USB to another port.
Keyboard dies again.
Move the Keyboard back to its original port.
3 minutes later the GPS finally connects.
Finally the GPS is discovered by the browser program.
“Click here to check for updates”
“Click here to download update.”
Finally the update goes through and I check the maps update. There are two options: the first is a lifetime update service that costs $120.
Yes One Hundred and Twenty dollars.
Or a one time update that costs $70.
Keep in mind that Garmin street maps aren’t all that great. When the GPS was new, a lot of the streets were already out of date. Plus I’m constantly aggravated by the fact that the maps never start out zoomed to the level where you can see surface streets; I always have to zoom in one level.
It also tries to redirect me onto streets that I know are slower. On the way to Bear Lake, instead of taking I-15 north and going 75 MPH (the posted limit), it wanted me to take a back highway to Brigham City. Admittedly, Highway 89 is a beautiful drive lined with fruit stands from all the nearby orchards.
But it’s slower!
All these gripes with the GPS, and they want me to pay for a map update that costs the same as the whole flipping GPS itself. In fact I could just buy the newer model for the same price, and I’m sure it would have a more up-to-date map in it.
Meanwhile my Android phone does everything the Garmin does, but it also gives me:
Maps that are as up to date as Google’s online database.
An application that updates over the air bi-monthly.
Satellite view of the surrounding area.
Current local traffic conditions.
An ETA adjusted for traffic.
Street view pictures of the intersections I need to turn at.
Current location of any friends and family with Latitude.
The ability to search and route to any nearby business, gas station, or ATM.
And best of all it’s FREE!!!
So as soon as I find a good dashboard car mount for my phone, I’ll have a Garmin Nuvi 250 GPS for sale. Then it’s goodbye and good riddance to Garmin.
It’s not April Fools’, not a concept; it’s for real.
Ed Fries, former VP of Microsoft’s Gaming Division, created it as a pet project. You can read more about it here.
Best part of all, you can play it now, even if you don’t have an old Atari 2600 hanging around.
I recently lost my job. It’s no big deal; I was already looking for a new one because I felt underutilized and under-recognized, but having the NEED to find a new job is never fun.
I worked for AT&T on the EVEN turn-up team. Basically that means when a big company gets a new T1 or DS3 line to connect their main and remote offices together, I’m the guy who sets up the routers so they work. A technician onsite plugs everything in, then I remotely connect and configure the device to work as the gateway between the customer LAN and AT&T’s backbone, with VPN tunnels to the company’s other sites.
This is a job that is very much in danger of being outsourced, since I don’t have to be anywhere near the equipment; from Orem I was connecting to devices all over the US and setting them up. If you can connect from 1,000 miles away, why not do it from 2,000 miles away? So, to save money, the company decided it would be better to have people in Slovakia do the job. Many other tech companies have already sent their services overseas; I’m sure most people have talked with an overseas representative at some point, and the tech industry is no different.
Obviously there are issues with having your talent located thousands of miles away from your company’s equipment. In fact, one incident made the problems of having people remotely configure systems very apparent.
One night I was migrating a customer site from their old routers to a new high-speed connection that required new equipment. The AT&T equipment was already set up and running, but the customer LAN hadn’t been moved over from the old equipment to the new. We had to make the change after everybody had gone home so as not to disrupt the network during business hours.
I was on a conference call with the customer’s LAN technician, who was himself an outsourced tech working in India. His accent was thick, but he was a very friendly guy; while we waited for the customer’s onsite tech to arrive, we chatted for a while. It was evening for me and pre-dawn for him, both of us on the odd shift away from home.
The problem was that the tech never showed. So what we had was an unmanned datacenter in New England, in a remote, locked building without so much as a night security guard to help us. Since the two routers involved were mounted right on top of each other, all we needed was a person to move a yellow wire from one connector to the identical connector 2 inches above it.
Both the Indian tech and I were helpless to do anything, each remotely connected from opposite points on the globe to two devices humming along next to each other, two inches from a job well done. All we needed was any flesh-and-blood human onsite to perform a task that required absolutely no technical knowledge whatsoever.
Outsourcing had affected the company in question so much that there was nobody left to perform the simplest of tasks. Of course we were able to reschedule the switchover for a couple weeks later (we were always booked up for about 8 business days), and the customer finally had some intern go make the switch late one night. Who knows how much money was lost depending on their oversubscribed line for a few extra weeks (they were already paying for our high-speed connection, since the dropped ball was their fault). Imagine having your business stuck on the equivalent of dial-up while paying for a DSL connection you can’t use, except in this case the unused DSL was over 20 times the cost of the connection you were stuck maintaining.
For me that highlighted one problem with sending all the knowledgeable people away from the company; the next big problem is the communication barrier and how it affects your service.
There are many extremely good bilingual techs outside of the US, but there are a lot more who don’t have a great hang of the language; and since you’re outsourcing precisely so you can pay people a fraction of US wages, chances are you’re going to get the ones with language issues. Dell has been struggling with this issue directly for many years.
When I first got a Dell laptop about a decade ago, Dell was lauded for some of the greatest customer service in the business. I even had to deal with them a few times myself and had nothing but praise for my experience. But then around 2002-2003 they outsourced everything to India, their reputation for service dropped to the bottom, and Dell customer service became a joke synonymous with uselessness. I found it was better to ignore the call-in service and use email, or just try fixing the issue myself without their help.
As for my job moving to Slovakia: I already know our US onsite techs, the guys who actually physically install the equipment, are dreading the changeover. I was told many times how relieved a tech was to be talking to somebody in the US, and how it was much easier to get a job done in 30 minutes through good communication than to spend 2 hours struggling through a foreign language barrier.
Most companies don’t see these secondary issues. Sure, it looks great on paper that you’re spending $20,000 less a week on remote techs, but they don’t notice that pay for onsite techs jumps $20,000 because those techs have to stay onsite longer for a job that isn’t done as quickly. You also can’t resolve customer issues as fast; the delays frustrate customers, and that can cost a contract. And just one lost contract of the size we dealt with would immediately wipe out any monetary gain from outsourcing.
Lost customer loyalty can be so bad that many businesses that outsourced in the early 2000s have already brought their customer-facing divisions back, so that when customers call in they get somebody speaking the same language.
The sad reality is that it will continue to happen and there is nothing any of us involved can do; it’s not even a South Park “They took our jobs!” issue. Businesses go wherever costs are lower to boost profits and, hopefully, pass savings on to the customer. It’d be nice if customers rose up and fought back against the trend of outsourcing, but the lure of slightly cheaper prices is too great.
And so I move on to a different job; hopefully it will be a while before that one is sent overseas too.
BTW: I know the guys in Slovakia are good guys; the picture at the top is a joke.
I thought the iPad was officially going on sale on Saturday, April 3rd (unless you’re a big enough news outlet to have gotten a preview model the day before yesterday).
So how come Stephen Fry already got one? Looks like he just picked it up at the Apple store too.
The New Adventures of Mr Stephen Fry
He did interview Steve Jobs, so maybe this was a parting gift for such a nice interview. If you’re in the US and you want one, you’re going to have to wait another 2 days.
Wozniak in a recent interview:
Woz: By the way, I solved the problem of battery life and [the lack of] multitasking on the iPhone.
Woz: Yeah. I just have two iPhones, so if the battery runs down on the first one, I can use the other. And if I’m talking on one, I can use the other one to look something up. You would not believe how much use I get out of that.
That’s not really solving the multitasking problem. If this were anybody else I’d think it was a joke, making fun of the fact that the iPhone is only half capable. As it is, I think he’s partially joking anyway, but it still highlights an unsolved problem.
Not all of us want to carry two phones, each with its own service plan, to solve the multitasking problem. AT&T even made one of their lame commercials making fun of that. As out of touch as they are, if even they think it’s a joke then it’s probably pretty bad.
My phone is 2 years old now, but I’ve already solved Wozniak’s problem with one device. First off, I didn’t get an iPhone. Second, I got a Bluetooth headset so I can talk while I hold the phone in my hands and take notes.
…Relax, I’m not a douche who wears the headset all the time. I only pull it out of my pocket to answer phone calls. Kind of like most people do with a regular phone, only I don’t have to hold a brick to the side of my head.
Wozniak says he’s getting two iPads as well; not sure if that’s also to solve a multitasking issue.
Although it’s the German version, it seems my opinion of punk music is closely enough related to Street Dogs and Last.fm that Bing’s algorithms tie the pages together.
I frankly find this hard to believe (I’m awesome, but not that awesome) and suspect that Bing must have some tracking cookie influencing the results to make them a bit more customized to the places I go. But even if THAT is true, it’s still impressive that Bing could make such a connection.
Still maybe I am influential enough that I’m front page news for Bing’s search.
How come HDTVs aren’t smart enough to auto-crop themselves to the right size? Nearly every TV I see has problems adjusting to fit the current program or commercial.
Obviously the problem comes from the fact that some programming is in 4:3 (standard) and some is in 16:9 (widescreen). Old programming will always be 4:3 even if everybody gets widescreens, but as long as the height is the same this isn’t a problem; there will just be black bars on the left and right of the image, like vertical letterboxing. The problem is when widescreen programming is letterboxed into 4:3 and then played on a 16:9 screen; usually it’s some local station still doing this to appease people who don’t have widescreen TVs yet. Now you have black bars all around.
These days TVs should be able to recognize that 40% of the screen is black and zoom in on the center of the image.
Instead they have a “zoom” option on the remote. This is nice, but the cropping always happens in the same place, so even the simplest software could recognize it. The TV should do this on the fly for you, and the zoom button should just be the option to turn it off if it’s misbehaving.
Some TVs have this auto-crop feature, but the dang thing never works, so you’re left zooming manually anyway. This is probably because the black regions have slightly different brightness levels, but why not add a slider to adjust the sensitivity of the black detection? Again, even the simplest software can tell the difference between pixels that stay near-black for 95% of the program and active, moving color pixels.
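For the curious, the detection really is that simple. Here’s a toy sketch in Python/NumPy (my own illustration of the idea, not any TV’s actual firmware), where the black-level threshold is exactly the sensitivity slider I’m asking for:

```python
import numpy as np

def active_region(frames, black_level=16, stay_black=0.95):
    """frames: uint8 array of shape (n_frames, height, width) holding luma samples.
    Returns the (top, bottom, left, right) crop box around the live picture."""
    dark = frames <= black_level               # near-black, per pixel, per frame
    bars = dark.mean(axis=0) >= stay_black     # pixels black in >=95% of frames
    rows = np.where(~bars.all(axis=1))[0]      # rows containing any live pixels
    cols = np.where(~bars.all(axis=0))[0]      # columns containing any live pixels
    return rows[0], rows[-1] + 1, cols[0], cols[-1] + 1

# The TV would then scale the (top:bottom, left:right) box to fill the panel.
```

Sample a few hundred frames, find the box, zoom. Hardly rocket science for a device that already has an HD video processor in it.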
Wow, usually I’m just blogging about things I’ve read elsewhere; this is news I actually discovered on my own!!
It stems from the fact that I always go through a certain process to prep new music before adding it to my library. I did a full write-up on the process at Last.fm a few years ago; one of these days I’ll copy and upload it here. Basically, I tag using MusicBrainz to get the right album, track, and artist tags, then move the files into a holding folder where I listen to them and assign an appropriate genre, sub-genre, and rating.
The problem is that when you edit the ID3 information in WMP12, it rewrites the ID3 header and corrupts all the non-standard ID3 tags on the MP3.
The metadata for an MP3 is held in an ID3 header slapped onto the front of the file. All the artist, track, and album names are contained here, along with album art if you want it. In addition to the standard ID3 tags, third parties can append their own tags to the system. Many programs put their own data in these non-standard tags, sometimes as a unique identifier for the file, and sometimes as data a player can read to dynamically change the song as it plays (typical of programs that adjust track gain).
However, many programs abuse the ID3 header. The worst I encountered was an expensive piece of DJ software that calculated the beats per minute and put this info into the header so the software could quickly sync tracks. The problem was that it bullied the other tags, removing them and replacing all the data with its own, as if it were the only music player you’d ever need.
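For contrast, here’s roughly what a well-behaved tag edit looks like, sketched with the Python mutagen library (just an illustration of the edit-one-frame, leave-the-rest behavior; WMP obviously isn’t doing anything like this internally):

```python
from mutagen.id3 import ID3, TIT2

tags = ID3("song.mp3")        # parse the existing ID3v2 header
print(list(tags.keys()))      # standard frames plus third-party ones,
                              # e.g. "UFID:http://musicbrainz.org"

tags.add(TIT2(encoding=3, text=["New Title"]))   # replace ONLY the title frame
tags.save()                   # rewrite the header; every other frame survives
```

Every frame you didn’t touch comes through intact, MusicBrainz identifiers and all.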
Luckily, after it corrupted my whole music library, I was able to reload from a backup and get my money refunded for the bogus software. That also led to my current methods of tagging and storing music; when you have over 40k songs, all individually tagged and rated over the last 13 years, you don’t want all that info wiped out by a poorly designed piece of software.
From WMP9 through WMP11 there was no problem with the way Windows dealt with ID3 tags. If you made a change to a tag, it would edit just that tag and leave everybody else’s tags alone. But something changed with the new WMP version included in Windows 7 x64 (probably 32-bit as well).
Now, simply adding a rating in Windows Media Player on Windows 7 will screw up all the non-standard tags on the file, in my case erasing the MusicBrainz unique identifier, which makes it a pain in the ass to update tags later down the line.
I used MP3Tag to view the ID3 tags. This happens for ANY ID3 change; it doesn’t matter whether you change the rating or a standard ID3 field like the track name.
After some back and forth with Microsoft support, I was told that this is a known issue and that hopefully it will be fixed in the future (I’m not holding my breath). And as Microsoft is promoting its Zune player and Media Center more heavily, I don’t think WMP will be much of a priority. Too bad, too, because I really like Windows Media Player better than iTunes, WinAmp, and foobar.
And finally, to rub salt into the wound, MusicBrainz currently has a bug where it erases all ID3 tags when it writes its own, regardless of its “erase existing tags” setting being disabled. MusicBrainz caught and reported the bug a year ago, but new revisions come about as often as Windows Media Player updates. So you can’t work around the problem by doing one before the other: if you tag in MusicBrainz, the WMP rating is gone; if you put a rating in WMP, the MusicBrainz UID is gone.
Hopefully one bug or the other will soon be fixed and the problem eliminated, but if you make any tag changes in Windows Media Player, be aware that third-party applications that use custom tags may have their data lost.
Update: Link to Musicbrainz bug report added. Thanks for the reminder!
This old ad I saw from Wired Reread reminded me of a recent discovery I made.
I found out that the really nice Technology Park I work at used to be WordPerfect Headquarters.
Old-timers know the significance, but new computer users may not. WordPerfect was the killer app to have on an IBM PC back in the ’80s and ’90s. Along with Quattro Pro, it made up the dominant office applications that made a PC worth owning (besides SkiFree).
However, the 800-lb gorilla, Microsoft, bundled a group of office programs into a suite with basic interoperability between them. While it was initially an inferior product, the simple interaction between word docs and spreadsheets was too powerful a feature, and WordPerfect and Quattro Pro fell before the Microsoft juggernaut.
Today the owners of WP spend their immense wealth on a giant arboretum/garden/golf course at the north end of Utah Valley. The office park was sold, and now the buildings are leased out to other tech companies: AT&T (where I’m soon to be unemployed), Omniture (recently acquired by Adobe), Bungee Labs, Intuit, Open Solutions, Orange Soda, and some other startups.
From a tech-history perspective, it’s kind of amazing being among the “ruins” of a company that held a virtual monopoly on the tech landscape in the heyday of the PC.
Imagine working at 1 Infinite Loop, Cupertino, 20 years in the future in the GoogleSkynet building, data-mining personal information to fuel the US-China Ad War, and spending your lunch break reminiscing about when people here were so worked up over a phone that could surf the internet.
In today’s fast-paced tech world nobody can remember back 3 years, let alone 20-30. But it’s good to remind yourself from time to time that the latest tablet craze, or Google’s latest move into *blank* technology, is just a passing moment that will be forgotten in less than half a generation. And in the future people will look back amused at us, just like we do when we think back to a day when a simple word-processing application was the main reason people bought desktop computers.
Well, the fun’s over: Windows Phone 7 is officially off my list for a future phone.
Basically it’s been confirmed that it won’t have copy and paste or multitasking.
I mentioned in a previous article that my biggest fear was that Microsoft would try so hard to copy the iPhone that they’d copy all its worst parts, and that seems to be the way it’s going. Even a slick new Zune-inspired interface can’t save a phone that doesn’t have basic functionality. It’s why I hate the iPhone, and I’m definitely not going to change my tune just because it’s Microsoft that is now screwing itself over.
The ironic thing is that I’ve been bagging on the iPhone for its lack of copy/paste and multitasking since day one. It took 3 years to get C/P, and it’s rumored to get multitasking now in its 4th year. All the while, WinMo has been rocking all that since about 2001.
Now Microsoft is regressing back to a state of suck that even the iPhone has finally cleared.
Oh well, there’s still hope for Android. And Windows Mobile 6.5 could probably live in the HD2 for a couple years before being completely outdated.
And there’s the slim hope that something may change. Either MS will realize its errors, or the rumored second business-phone OS will materialize (or they’ll fix WinMo 6.x).
No, not that one. Although…
I’ve been looking for a new smartphone, and I can’t deny the Nexus One looks sweet. Now Google says they’ve got a new model that will run on AT&T’s 3G network (before, it was 2G only).
I’m really more interested in the most recent Android build and would prefer it in a keyboard slider phone, but a Nexus would be cool too. However, as of now I’m still waiting until the Dell Mini 5 “Streak” comes out so I can see what it’s like in hand. If I can stand the size, I might prefer that for my new Android phone.
Also, I may stay with Windows Mobile; the new info coming out of MIX about WinMo 7 looks cool, but I still like the openness of 6.5. So the HD2, and possibly the Touch Pro 3, may be my new phone.
Cool that the Nexus is in play but I’m still on the fence.
A few people were down on Cisco for promising the “Next Generation” of the internet and then just releasing a new router a few days ago.
I’ll skip over the fact that EVERY manufacturer claims their gear is Revolutionary, Game Changing, or the Next Generation; I mean, the iPad is just a big iPhone, but apparently it’s “Magic.” For me, magic is when a beautiful girl comes out of a genie bottle, calls me “Master,” crosses her arms and bobs her head, and creates a huge feast out of thin air.
So Cisco didn’t just create the next generation of the internet. But they built the device that can handle the throughput for the next generation of the internet, and that’s just as important. Bullet trains may be the “next generation” of rail travel, but without rails they’re just expensive, immobile hunks of metal. Cisco makes about 86% of the internet’s routing devices, so when they see big bumps, the whole web benefits.
It’s easy for casual home internet users not to realize how important the backend of the internet is, but that’s only because the internet has never run out of bandwidth. Can you imagine what it would be like if your home DSL connection ran at dial-up speeds from 2pm to 8pm because the net was overloaded with Hulu streams?
Luckily, backend technologies are keeping well ahead of current demand, and this is a moment when the potential expanded to three times its size. The Next Generation it may not be, but it’s still quite an accomplishment.
The CRS devices are powerful on their own, but their big claim to fame when originally developed was the ability to cluster into one super-router. Through this clustering, a single location can have a theoretical routing throughput of 322 terabits per second. To put that in perspective, as Cisco states, 322 Tbps is enough to transfer the entire printed collection of the Library of Congress in just over one second; for every man, woman, and child in China to make a video call simultaneously; and for every motion picture ever created to be streamed in less than four minutes.
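Those comparisons pass a basic sanity check, too. Here’s the China one worked out with round numbers (my own ballpark figures, not Cisco’s math):

```python
# Rough sanity check of the "all of China on video calls" claim.
capacity_bps = 322e12        # 322 Tbps of clustered throughput
callers = 1.3e9              # ballpark population of China

per_caller_kbps = capacity_bps / callers / 1e3
print(f"~{per_caller_kbps:.0f} kbps per simultaneous caller")   # about 250 kbps
```

A couple hundred kbps per caller is plausible for a modest-quality video stream, so the marketing math holds up.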
Already, Cisco and AT&T (ironically, my last 2 employers) are researching how to put the new tech to use creating “thicker” backbones. AT&T owns most of the backbone connections that link regional carriers, and even most of the undersea links coming into and out of the US. Unlike the latest smartphone, tablet, or laptop release, or any of the up-and-coming websites at SXSWi, this development by Cisco will directly impact you and your life.