The future of Xbox is to copy Steam Machines

More to the point, the future is licensing.

As Microsoft has shown time and time again, they know how to observe a good idea and copy it. This tenet has been core to the way Microsoft operates since its inception, when Bill Gates took what he learned from developing software for the Macintosh and replicated the idea as Windows for IBM-compatible machines running DOS. As the company grew, so did its number of operating systems for different devices. PCs had their own OS, but so did mobile phones, embedded components, servers, MP3 players, game consoles, and more. Now, as CEO Satya Nadella oversees the unification of OS code, the Windows 10 model returns Microsoft to a position of offering that open-market choice to hardware manufacturers.

With the Xbox set to run Windows 10, this business will surely undergo a transformation. Xbox head Phil Spencer continues to emphasize the blurring of Xbox and PC gaming. It is widely accepted that the Xbox One's hardware specs are well behind Sony's PlayStation 4, and it is also understood that Microsoft was selling the console at nominal profit (although that figure dates from when the Kinect was included). As game developers become more adept at squeezing the most out of consoles now nearly two years on the market, the gap in hardware capacity is becoming more and more evident. Microsoft will be looking to avoid this embarrassment in the future, and the best way to insulate itself from blame is to share the responsibility.

Innovative gaming company Valve will release its first Steam Machines in November of this year. Steam Machines are essentially computers running Valve's specialized Linux distribution, SteamOS. Manufacturers must submit hardware configurations to Valve for testing and certification in order to receive Steam Machine licensing. Valve's idea is brilliant. It gives consumers a choice about how much they are willing to spend: does the customer want a $500 box or a $2,000 box? Complaints are minimized because it is not a one-size-fits-all approach, and it provides Valve with licensing revenue. But alas, the business model itself isn't patentable. And Microsoft will copy it for Xbox.

The embarrassment that was, and is, the Xbox One will never be repeated. Xbox gaming hardware won't be second best again. A wealth of third-party hardware manufacturers will take the helm as new Windows SKUs appear with names like Windows TV, Windows Xbox, or Windows Gaming. There will likely be three to five grades of system, determining which settings, and even which games, each grade can run. Questions remain, though. Will Xbox gamers be able to build their own hardware configurations for these SKUs? If not, will the machines be user-upgradeable? Can graphics cards be swapped? Can RAM be increased? Or will one be forced to buy a new "Xbox" to improve the hardware? What other Windows features will be available within the OS? And when will this change occur?

These questions are clearly being worked out. But whatever happens, the change will be good for Microsoft. It saves the company the trouble of R&D, manufacturing logistics, retail agreements, shipping arrangements, international regulatory tribunals, supply-chain creation, and hardware support, along with the other facets of building and selling hardware. Thousands of jobs will surely be cut. There are also new revenue streams: licensing fees, new subscription fees from users, accessories, and a constant supply of up-to-date hardware choices for the Xbox ecosystem. To borrow Jon Stewart's Daily Show sign-off: well played, Mr. Nadella.

4K TVs are here… but they’re already obsolete

Currently, an active "war" is in progress, one which mainstream technology blogs have curiously neglected to cover. Call it the "Battle for the 4K AV Port." This contentious issue will determine which ports sit on the back of your TV in a few years. And if you're thinking about waiting to purchase that new 4K TV, please keep waiting.

Not so long ago, some may remember, we used analog white, red, and yellow RCA cables to transmit audio and video. Several intermediary steps have moved video technology forward, including VGA, S-Video, YPbPr, and DVI, among many others. HDMI is widely accepted as the current standard for audio and video (AV) connections. But with 4K entering mainstream adoption, a new generation of port is needed to push the higher bandwidth that accompanies the increase in pixels. Several standards are competing at the moment, and it would be naive to believe they can all win.
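To put numbers on that bandwidth jump, here is a back-of-the-envelope calculation for uncompressed 8-bit-per-channel video. Real links add blanking and encoding overhead, so actual requirements run higher, but the ratio is what matters:

```python
# Rough uncompressed bandwidth for a video stream:
# pixels per frame * frames per second * bits per pixel.

def uncompressed_gbps(width, height, fps, bits_per_pixel=24):
    """Raw video bandwidth in gigabits per second (no blanking or encoding overhead)."""
    return width * height * fps * bits_per_pixel / 1e9

hd_1080p = uncompressed_gbps(1920, 1080, 60)  # ~3.0 Gbit/s
uhd_4k = uncompressed_gbps(3840, 2160, 60)    # ~11.9 Gbit/s

print(f"1080p60: {hd_1080p:.1f} Gbit/s")
print(f"4K60:    {uhd_4k:.1f} Gbit/s ({uhd_4k / hd_1080p:.0f}x the pixels)")
```

A 4K frame has exactly four times the pixels of a 1080p frame, so at the same refresh rate and color depth the port must move four times the data.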

The currently dominant port, HDMI, was founded by several large players in the TV and AV component industries, including Hitachi, Panasonic, Philips, Sony, Toshiba, Technicolor, and Silicon Image. This group, the HDMI Consortium, licenses the use of HDMI for several fees: there are annual membership dues along with port fees. In other words, when a manufacturer puts an HDMI port on a device, it must pay the Consortium a royalty of between $0.04 and $0.15 USD. Considering the staggering number of devices with multiple HDMI ports, that is a very lucrative business. One big problem for HDMI is that its current revision, HDMI 2.0, remains far behind the competition in transmittable bandwidth. Further damning is the slow uptake by computer and graphics card manufacturers.
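As a rough illustration of the scale involved (the device count and per-port rate below are illustrative assumptions, not Consortium figures):

```python
# Hypothetical royalty revenue: devices shipped * ports per device * per-port fee.
# All inputs are made-up round numbers for illustration only.

def royalty_revenue(devices_shipped, ports_per_device, fee_per_port):
    """Total annual royalty under a simple per-port licensing model."""
    return devices_shipped * ports_per_device * fee_per_port

# e.g. 200 million TVs and receivers a year, 3 HDMI ports each, $0.05 per port
annual = royalty_revenue(200_000_000, 3, 0.05)
print(f"${annual:,.0f} per year")  # $30,000,000 per year
```

Even at the low end of the royalty range, a few ports per device across hundreds of millions of devices adds up to tens of millions of dollars a year.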

The USB Implementers Forum, the group responsible for licensing USB technology, adheres to a similar pricing scheme. Its founding members were Compaq, DEC, IBM, Intel, Microsoft, NEC, and Nortel. USB was originally designed for communication between computing devices, but in its most recent revision, USB 3.1 with the Type-C connector, the bandwidth and power a USB cable can carry make it a viable contender to HDMI. This group of computing giants can hold its own in potentially supplanting some of the biggest AV component suppliers in the HDMI Consortium.

Lastly, we have DisplayPort. This technology sprouted from VESA, which was founded by NEC, ATI, Western Digital, and others. DisplayPort charges $0.20 USD per port. When it launched in 2008, DisplayPort was leaps ahead of the competition: it could carry much greater bandwidth than HDMI, with smaller cables and connectors. But USB has since closed that gap, and DisplayPort signals can now be carried over USB 3.1 Type-C connections. It is hard to imagine VESA itself finding a seat at the table.

Although highly unlikely, it is possible for another group to usurp the incumbents. Much has changed in the electronics manufacturing landscape since these cabling technologies were formed decades ago. Power has shifted to companies that were less prevalent in the '90s and '00s. The two biggest producers of TV panels today are LG and Samsung, South Korean multinational conglomerates absent from these ruling bodies. Apple can easily be considered the premium device manufacturer of the moment, and its proprietary 30-pin and Lightning ports were never adopted by the wider industry. Google has the largest presence in mobile. Qualcomm is the marquee ARM chipmaker. AMD is certainly a player, providing the APUs for both Microsoft's and Sony's gaming consoles. Nvidia is a powerhouse in the GPU industry. A bevy of other companies would surely sign up if invited.

The fight for the new standard port is dynamic. Unfortunately, it is deprived of the coverage it surely deserves.

Destructive Environments: Up in the Cloud

During Microsoft's Build 2014 conference, one of many interesting projects demoed was a prototype game using both local and server resources. Displayed onscreen was the intricate dismantling of a multi-story building. When a glass pane was destroyed by the operator's industrial laser weapon, it shattered into hundreds of pieces; chunks of concrete then fell under gravity before breaking into hundreds of additional shards. The demo, later revealed to be a very early build of the next Crackdown title on Xbox One, was shown primarily as one way the Xbox One might outdo its competition on the graphics front. The performance disparity between the DDR3 memory of Microsoft's console and the GDDR5 memory in Sony's PlayStation 4 has been covered in the games media ad nauseam. Yet having just wrapped up Build 2015, Microsoft has still not released a title for its Xbox platform with the demoed tech.

Prototypes can be misleading, so skepticism about Microsoft's claims is understandable. In this case, though, the skepticism can be readily dismissed, because enmeshing local and server resources for gaming has already been accomplished. Titanfall, the EA-published multiplayer-only game, delegated real-time enemy AI to Azure servers. NPC spawning, movement, and clean-up were computed "in the cloud" and then pushed to players in real time. Most importantly, Titanfall executed well: the game performs as expected.

While AI and destructible environments are two different components of a game engine, they share underlying code: spatial positioning, placement, and movement are traits common to both. Offloading these computationally intensive tasks to a more powerful server, which then distributes the results to everyone connected to a given game session, generalizes from one to the other.
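The offloading pattern itself is easy to sketch. What follows is a minimal illustration under assumed names, not Microsoft's or Respawn's implementation: the server integrates debris physics once per tick, then pushes the authoritative positions to every client in the session instead of each console computing them locally.

```python
# Minimal sketch of server-side physics with fan-out to clients.
# All names and values here are illustrative; the real engine code is not public.

GRAVITY = -9.8
TICK = 1.0 / 30  # 30 physics updates per second

def step_debris(debris):
    """Advance every debris chunk by one tick (the server-side work)."""
    for chunk in debris:
        chunk["vy"] += GRAVITY * TICK
        chunk["y"] += chunk["vy"] * TICK
    return debris

def broadcast(debris, clients):
    """Push the authoritative positions to every connected client."""
    snapshot = [(c["id"], round(c["y"], 3)) for c in debris]
    for client in clients:
        client.append(snapshot)  # stand-in for a network send

debris = [{"id": i, "y": 50.0, "vy": 0.0} for i in range(3)]
clients = [[], []]  # two connected players, modeled as message queues
for _ in range(3):  # three server ticks
    broadcast(step_debris(debris), clients)

print(clients[0][-1])  # latest snapshot seen by player 0
```

The design point is that the expensive loop runs once on the server regardless of player count; every client receives identical, already-computed positions.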

This technology was shown over a year ago and has yet to appear in any publicly available game, alpha, beta, or otherwise. So where is it? Perhaps it is in HoloLand, a narrow place where the $2.5 billion Minecraft and $7.2 billion Nokia acquisitions reside.

Source: Polygon

6/18/15 UPDATE: E3 2015 has just concluded, and this graphics-enhancing technology is still MIA.

8/04/15 UPDATE 2: Crackdown 3 was finally revealed at Gamescom 2015 with this tech in tow.

Portable Software on OneDrive

Running portable freeware on PCs has existed as a subculture for several years. The PortableApps platform launched in 2006, built around the idea of portability: being able to carry around your programs and files and access them from any Windows PC, wherever you were, was an appealing concept. Remember, this was well before the iPhone's arrival, before mobile OSes, and before cloud services became a bandwidth-viable reality for consumers. Also, "apps" here is short for software generally, not to be confused with Apple's redefinition of the term for touch applications when it launched the App Store in 2008.

Mobile "apps" purchased from stores are now tied to the respective account, and personal file storage has been addressed by cloud services like Dropbox, OneDrive, Google Drive, and Box. So all is well and good, right? Wrong. Arguments about the viability of mobile OSes for productivity can be found in many places and will not be rehashed here, with one exception: however simple mobile apps are to use, they certainly aren't as feature-rich or robust as desktop software.

For desktop productivity users, the obvious resolution is to combine the benefits of cloud storage with portable software, keeping a portable "Program Files" directory readily available. In theory, perfectly logical. It does work. However, it comes at a cost in computer resources.

For instance, I run a clipboard manager called Ditto, which journals every image or piece of text that is cut or copied. It starts with the computer and, located within my OneDrive folder on Windows, runs in the background, writing changes to its main database file on every copy and paste. The program itself is minimal, coming in at about 2 MB, but the database it creates grows large quickly; mine is at 30 MB after being cleared just a month ago. Every time a change is made, the OneDrive synchronization process eats up to 30% of this dual-core CPU. The "unsynchronized" file is then uploaded, slowing down network activity. Then Windows Search indexing kicks in, gobbling up to 25% of the notebook's total CPU.

If this happened once or twice a month, that would be fine. But it is a regular occurrence. Ditto is the worst offender, but other examples include:

PicPick – graphics tools
MusicBee – music manager and player
Pazera – fast audio/video converters

Of course, OneDrive, a built-in process on Windows 10, isn't the only option. One could install software like Dropbox or Google Drive, which offer sync-frequency options, but these also run in the background and consume additional resources, creating the same superfluous use of CPU cycles. Although it is highly improbable, Microsoft could address this by letting a designated folder within OneDrive synchronize at a much more infrequent rate than the persistently synced folders. Much like a DMZ on a router, this folder would have its own rules.
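The folder-with-its-own-rules idea amounts to debouncing: instead of uploading on every write, wait for a quiet period before syncing. A minimal sketch of that policy follows; OneDrive exposes no such setting today, and the upload here is a hypothetical stand-in:

```python
import time

class DebouncedSync:
    """Upload only after no changes have occurred for `quiet_secs`."""

    def __init__(self, quiet_secs=300, clock=time.monotonic):
        self.quiet_secs = quiet_secs
        self.clock = clock       # injectable for testing
        self.last_change = None
        self.uploads = 0

    def on_change(self):
        """Record a file modification; resets the quiet-period timer."""
        self.last_change = self.clock()

    def poll(self):
        """Call periodically; returns True when a sync actually fires."""
        if self.last_change is None:
            return False
        if self.clock() - self.last_change >= self.quiet_secs:
            self.last_change = None
            self.uploads += 1    # stand-in for the real upload
            return True
        return False

# Simulated clock: a burst of copies at t=0, 1, 2 seconds yields a single
# upload once five quiet minutes have passed, not one upload per copy.
t = [0.0]
sync = DebouncedSync(quiet_secs=300, clock=lambda: t[0])
for t[0] in (0.0, 1.0, 2.0):
    sync.on_change()
t[0] = 302.5
sync.poll()
print(sync.uploads)  # 1
```

A rapid burst of clipboard writes collapses into one upload, which is exactly the CPU and network behavior the Ditto-in-OneDrive setup is missing.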

Blending desktop software with new storage mediums like OneDrive is currently possible, but not without unnecessary use of CPU resources. Whether we will get more advanced options as these storage technologies mature and connectivity solutions improve remains an open question.

Cheese Melting Properties: Infographic

Someone posted an infographic on Reddit about different cheese types and how they melt. The graphical elements and text were far too small to print and stick on a fridge.

After some searching, it turned out that the Reddit post was a sloppy copy-paste remake of an even more outdated infographic.

Original & Remake

Discontent with both, I spent a few hours making a new one.

Cheese Melt

The image is configured to print on legal size paper. Enjoy the fromage!

The best phone for senior citizens, the elderly, and old people

Currently, the best phone on the market for people born before 1945 is a feature flip phone from Alcatel, the OneTouch 2010. There are several reasons why.

For starters, it’s a flip phone. The mechanical action of bidirectional flipping is a concept anyone who has used a light switch is familiar with. The implications of each action are observable in the real world. Opening the phone initiates a session where the phone is active. Closing the phone concludes that session. Some psychologists suggest that hearing the noise of the clasp as it closes reinforces the perception of the beginning and ending of use. These ideas are tangible and easy to understand, but they only explain how your relative can answer a call. What if they want to call you?

The answer is speed dial. After you manually program in the relevant phone numbers, the most important ones can be assigned to speed dial. A laminated phone chart can help your elderly relative here, ideally with large text showing each name followed by its speed dial assignment. With speed dial set up, your relative has up to nine people who can easily be contacted. Instructions vary, but most phones require opening the phone, then pressing and holding the corresponding speed dial digit.

This is where the Alcatel 2010 excels. Aesthetically, this flip phone is minimalist, simple, and elegant. The design looks great, but it also drastically enhances function. The keypad contains only the buttons necessary for making calls, and this reduction is complemented by an excellent engineering choice: an edge-to-edge keypad design. That design creates additional surface area, further increasing button size and producing the largest-in-class hit boxes for shaky fingers to connect with. Plain and simple, the keypad is what makes this device the go-to option for elderly users: its appearance is easy for their eyes and minds to interpret, and its buttons are large enough for their fingers to press.

In contrast, nearly every other flip phone on the market tacks on superfluous buttons. 2 MP rear cameras, ancient web browsers, and dedicated Facebook buttons are among the "features" that anyone still using a flip phone in 2014 doesn't need and will never use. These additional buttons take up space on an already cramped surface, and they confuse the very people still using flip phones in 2014.

Praise aside, there are areas where the 2010 can improve. The screen is rather small at 2.4 inches (61 mm) measured diagonally; it would be nice to see Alcatel adopt an edge-to-edge design for the screen in the 2010's successor, regardless of resolution. As for the speaker and microphone, this US-based writer cannot say. But the most important gain Alcatel can make is in availability.

The 2010 is sold by several carriers in several markets across the planet; Vodafone UK and O2 Germany are two that provide this device. But if you and your relative reside in the United States, tough luck. In England and Germany the phone costs about $45, taxes included, for a new device with a full one-year warranty. US consumers can only buy these phones used on eBay for $100 plus $20 international shipping, warranty excluded. And even if the warranty were included, Alcatel's authorized US repair center is prohibited from servicing the device.

Feeling blue? Wait just a second…

Some potentially hopeful news was recently made public. In a recent Wall Street Journal interview, Alcatel OneTouch Chief Marketing Officer Dan Dery discussed the company's plans for 2015, stating Alcatel's intention to push toward marketing directly to interested consumers, with plans involving some form of e-commerce. His statements were vague, though, and even if a US storefront were to materialize, there is no guarantee that the 2010 would be available.

For the sake of the elderly here in the United States, let’s hope that this possibility becomes an actuality.