Monthly Archives: August 2016

Tilera’s TILE-Gx72 Processor Sets World Record for Suricata IPS/IDS: Industry’s Highest Performance

Tilera® Corporation, the leader in 64-bit manycore general purpose processors, today announced it has achieved the highest ever single-chip Suricata performance, delivering 4x the performance, and 7x the performance-per-watt of a high-end x86 multicore processor. Suricata is the industry-leading open source Intrusion Detection and Prevention System (IDS/IPS) developed by the Open Information Security Foundation (OISF) and supported by the US Department of Homeland Security (DHS) to secure networks against next generation security attacks.

Sophisticated intrusion detection and prevention, such as Suricata implements, requires deep packet inspection and pattern-matching that taxes the capabilities of even the highest performance processors. The Suricata performance achieved on the TILE-Gx72 processor is double that of the previous record-holder, the TILE-Gx36, providing organizations with network security that scales with their networks.

“Tilera’s TILE-Gx processors are continuing to lead the market in Suricata performance, and the impressive results with the TILE-Gx72 demonstrate the synergy between a massively manycore processor and Suricata’s multi-threaded implementation,” stated Matt Jonkman, president, OISF. “With the continuing rise of security threats and incidents, corporate enterprises, carriers and government organizations are adopting Suricata for their IDS/IPS and leveraging the TILE-Gx processor family, coupled with the MDE development environment, to achieve the best performance.”

The TILE-Gx72 is the world’s highest performance and highest efficiency processor with integrated System-on-Chip (SoC) features including eight 10Gb Ethernet ports, 24-lanes of PCI Express, four DDR3 memory controllers, and 23 Mbytes of on-chip cache. The wire-speed, programmable mPIPE front end processes 240 Mpps of bi-directional Ethernet traffic and improves the efficiency of network-heavy applications. With its exceptionally low power profile, several TILE-Gx72 processors can be populated in a single compact datacenter appliance, providing 576 cores of compute and 640Gbps of packet processing with 8 sockets.

The multi-threaded Suricata IDS/IPS application, version 1.4.0, was ported using Tilera’s Multicore Development Environment (MDE) version 4.1, a full-featured and standards based run-time Linux environment for TILE-Gx processors. The recent “live rule swap” update supports dynamic insertion of new threat signatures into Suricata and enables rapid response to threats such as Zero-Day Attacks.

“We track Moore’s Law with the tile-based architecture and significantly raise the bar with our TILE-Gx72 processor, incorporating twice the number of cores of our previous high-end processor. Tilera’s high-performance iMesh interconnect enables Suricata performance to scale linearly with the additional cores,” said Devesh Garg, president and CEO of Tilera. “Once again, we are demonstrating that the TILE-Gx architecture provides a real-world advantage in scalable application performance, power efficiency, and overall compute density.”

The Suricata solution is available on all of Tilera’s TILE-Gx platforms, ranging from the TILEncore-Gx series of PCIe cards with multiple 10Gbps Ethernet interfaces, to the TILEmpower-Gx 1RU standalone appliance and the TILExtreme-Gx 1RU multi-socket platform with up to 288 cores of compute.

The Evolution of Direct3D

* UPDATE: Be sure to read the comment thread at the end of this blog, the discussion got interesting.

It’s been many years since I worked on Direct3D and over the years the technology has evolved dramatically. Modern GPU hardware has changed tremendously over the years, achieving processing power and capabilities way beyond anything I dreamed of having access to in my lifetime. The evolution of the modern GPU is the result of many fascinating market forces, but the one I know best and find most interesting was the influence that Direct3D had on the new generation of GPU’s that support thousands of processing cores, have billions more transistors than the host CPU, and are many times faster at most applications. I’ve told a lot of funny stories about how political and crazy the creation of Direct3D was, but here I would like to document some of the history of how the Direct3D architecture came about, an architecture that had a profound influence on modern consumer GPU’s.

Published here with this article is the original documentation for Direct3D when it was first introduced with DirectX 2 in 1995. Contained in this document is an architectural vision for 3D hardware acceleration that was largely responsible for shaping the modern GPU into the incredibly powerful, increasingly ubiquitous consumer general purpose supercomputers we see today.

[Image: D3DOVER]
The reason I got into computer graphics was NOT an interest in gaming, it was an interest in computational simulation of physics. I studied 3D at Siggraph conferences in the late 1980’s because I wanted to understand how to approach simulating quantum mechanics, chemistry and biological systems computationally. Simulating light interactions with materials was all the rage at Siggraph back then, so I learned 3D. Understanding light, 3D mathematics and physics made me a graphics and color expert, which got me a career in the publishing industry early on creating PostScript RIP’s (Raster Image Processors). I worked with a team of engineers in Cambridge, England creating software solutions for printing screened color graphics before the invention of continuous tone printing. That expertise got me recruited by Microsoft in the early 1990’s to re-design the Windows 95 and Windows NT print architecture to be more competitive with Apple’s superior capabilities at that time. My career came full circle back to 3D when an initiative I started with a few friends to re-design the Windows graphics and media architecture (DirectX) to support real-time gaming and video applications resulted in gaming becoming hugely strategic to Microsoft. Sony had introduced a consumer 3D game console (the PlayStation 1), and being responsible for DirectX it was incumbent on us to find a 3D solution for Windows as well.

For me, the challenge in formulating a strategy for consumer 3D gaming for Microsoft was an economic one. What approach to consumer 3D should Microsoft take to create a vibrant, competitive market for consumer 3D hardware that was both affordable to consumers AND future proof? The complexity of realistically simulating 3D graphics in real time was so far beyond our capabilities in that era that there was NO hope of choosing a solution that was anything short of an ugly hack that would produce “good enough” 3D for games while being very far removed from the mathematically ideal solutions we had little hope of seeing implemented in the real world during our careers.

Up until that point the only commercial solutions for 3D hardware were for CAD (Computer Aided Design) applications. These solutions worked fine for people who could afford hundred-thousand-dollar workstations. Although the OpenGL API was the only “standard” for 3D API’s that the market had, it had not been designed with video game applications in mind. For example, texture mapping, an essential technique for producing realistic graphics, was not a priority for CAD models, which needed to be functional, not look cool. Rich dynamic lighting was also important to games but not as important to CAD applications. High precision was far more important to CAD applications than to gaming. Most importantly, OpenGL was not designed for highly interactive real-time graphics that used off-screen video page buffering to avoid tearing artifacts during rendering. It was not that the OpenGL API could not be adapted to handle these features for gaming, simply that its actual market implementation on expensive workstations did not suggest any elegant path to a $200 consumer gaming card.

In the early 1990’s computer RAM was very expensive; as such, early 3D consumer hardware designs optimized for minimal RAM requirements. The Sony PlayStation 1 optimized for this problem by using a 3D hardware solution that did not rely on a memory-intensive data structure called a Z-buffer; instead it used a polygon-level sorting algorithm that produced ugly intersections between moving joints. This “Painter’s Algorithm” approach to 3D was very fast and required little RAM. It was an ugly but pragmatic approach for gaming that would have been utterly unacceptable for CAD applications.
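
To make the trade-off concrete, here is a deliberately tiny C++ sketch (not code from the article, nor from any real console or driver) contrasting the two hidden-surface approaches: whole-polygon back-to-front sorting versus a per-pixel Z-buffer. The types and the four-pixel “framebuffer” are invented purely for illustration.

// Minimal sketch contrasting the two hidden-surface approaches discussed above.
// Hypothetical types and values for illustration only; not code from the era.
#include <algorithm>
#include <array>
#include <cstdio>
#include <vector>

struct Polygon { float avgDepth; int color; };

// Painter's algorithm (PlayStation 1 style): sort whole polygons back-to-front
// and draw them in order. Cheap on RAM, but intersecting polygons sort wrong.
void paintersAlgorithm(std::vector<Polygon> polys, std::array<int, 4>& framebuffer) {
    std::sort(polys.begin(), polys.end(),
              [](const Polygon& a, const Polygon& b) { return a.avgDepth > b.avgDepth; });
    for (const Polygon& p : polys)
        for (int& px : framebuffer) px = p.color;   // farthest drawn first, nearest last
}

// Z-buffer: keep a per-pixel depth; a fragment wins only if it is nearer.
// Correct at every pixel, but costs a whole extra buffer of RAM.
void zBuffer(const std::vector<Polygon>& polys, std::array<int, 4>& framebuffer) {
    std::array<float, 4> depth;
    depth.fill(1e30f);
    for (const Polygon& p : polys)
        for (size_t i = 0; i < framebuffer.size(); ++i)
            if (p.avgDepth < depth[i]) { depth[i] = p.avgDepth; framebuffer[i] = p.color; }
}

int main() {
    std::vector<Polygon> scene = {{5.0f, 1}, {2.0f, 2}};  // two flat "polygons"
    std::array<int, 4> fb{};
    paintersAlgorithm(scene, fb);
    std::printf("painter's: %d\n", fb[0]);
    zBuffer(scene, fb);
    std::printf("z-buffer:  %d\n", fb[0]);
}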

In formulating the architecture for Direct3D we were faced with innumerable similar difficult choices. We wanted the leading Windows graphics vendors of the time (ATI, Cirrus, Trident, S3, Matrox and many others) to be able to compete with one another for rapid innovation in the 3D hardware market without creating utter chaos. The technical solution that Microsoft’s OpenGL team espoused, via Michael Abrash, was a driver model called 3DDDI (3D Device Driver Interface). 3DDDI was a very simple, flat driver model that just supported hardware acceleration of 3D rasterization. The complex mathematics associated with transforming and lighting a 3D scene were left to the CPU. 3DDDI used “capability bits” to specify additional hardware rendering features (like filtering) that consumer graphics card makers could optionally implement. The problem with 3DDDI was that it invited problems for game developers out of the gate. There were so many cap-bits that every game would either have to support an innumerable number of feature combinations across unspecified hardware, to take advantage of every possible way that hardware vendors might choose to design their chips, producing an untestable number of possible hardware configurations and a huge amount of redundant art assets that games would have to lug around to look good on any given device, OR games would revert to using a simple set of common 3D features supported by everyone, and there would be NO competitive advantage for companies to support new hardware 3D capabilities that did not have instant market penetration. The OpenGL crowd at Microsoft did not see this as a big problem in their world, because everyone just bought a $100,000 workstation that supported everything they needed.

The realization that we could not get what we needed from the OpenGL team was one of the primary reasons we decided to create a NEW 3D API just for gaming. It had nothing to do with the API itself, but with the driver architecture underneath, because we needed to create a competitive market that did not result in chaos. In this respect the Direct3D API was not an alternative to the OpenGL API; it was a driver API designed for the sole economic purpose of creating a competitive market for 3D consumer hardware. In other words, the Direct3D API was not shaped by “technical” requirements so much as economic ones. In this respect the Direct3D API was revolutionary in several interesting ways that had nothing to do with the API itself but rather the driver architecture it would rely on.

When we decided to acquire a 3D team to build Direct3D with, I was chartered with surveying the market for candidate companies with the right expertise to help us build the API we needed. As I have previously recounted, we looked at Epic Games (creators of the Unreal engine), Criterion (later acquired by EA), Argonaut and finally Rendermorphics. We chose Rendermorphics (based in London) because of the large number of quality 3D engineers the company employed and because the founder, Servan Keondjian, had a very clear vision of how consumer 3D drivers should be designed for maximum future compatibility and innovation. The first implementation of the Direct3D API was rudimentary but quickly evolved towards something with much greater future potential.

[Image: D3DOVER rendered left-handed. Whoops!]

My principal memory from that period was a meeting in which I, as the resident expert on the DirectX 3D team, was asked to choose a handedness for the Direct3D API. I chose a left-handed coordinate system, in part out of personal preference. I remember it now only because it was an arbitrary choice that caused no end of grief for years afterwards as all other graphics authoring tools adopted the right-handed coordinate system of the OpenGL standard. At the time nobody knew or believed that a CAD tool like Autodesk’s would evolve to become the standard tool for authoring game graphics. Microsoft had acquired Softimage with the intention of displacing Autodesk and Maya anyway. Whoops …
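
For readers who have never hit this particular wall: converting geometry authored under the right-handed (OpenGL-style) convention into a left-handed (Direct3D-style) space typically means mirroring the Z axis and flipping triangle winding. The little C++ sketch below is purely illustrative; the types are invented and it is not drawn from any actual Direct3D code.

// Toy illustration of the handedness headache: mirror Z and fix the winding order.
// Invented types; illustrative only.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Triangle { Vec3 v[3]; };

Triangle rightToLeftHanded(Triangle t) {
    for (Vec3& v : t.v) v.z = -v.z;      // mirror the Z axis
    Vec3 tmp = t.v[1];                   // swap two vertices so the winding
    t.v[1] = t.v[2];                     // order (and thus the face normal)
    t.v[2] = tmp;                        // stays consistent after the mirror
    return t;
}

int main() {
    Triangle t = {{{0, 0, 1}, {1, 0, 1}, {0, 1, 1}}};
    Triangle lh = rightToLeftHanded(t);
    std::printf("z after conversion: %.1f\n", lh.v[0].z);  // prints -1.0
}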

The early Direct3D HAL (Hardware Abstraction Layer) was designed in an interesting way. It was structured vertically into three stages.

[Image: DX 2 HAL]

The highest layer was the most abstract, the transformation layer; the middle layer was dedicated to lighting calculations; and the bottom layer was for rasterization of the finally transformed and lit polygons into depth-sorted pixels. The idea behind this vertical driver structure was to provide a relatively rigid feature path for hardware vendors to innovate along. They could differentiate their products from one another by designing hardware that accelerated increasingly higher layers of the 3D pipeline, resulting in greater performance and realism without incompatibilities, a sprawling matrix of configurations for games to test against, or redundant art assets. Since the Direct3D API created by Rendermorphics provided a “pretty fast” software implementation of any functionality not accelerated by the hardware, game developers could focus on the Direct3D API without worrying about myriad permutations of incompatible hardware 3D capabilities. At least that was the theory. Unfortunately, like the 3DDDI driver specification, Direct3D still included capability bits designed to enable hardware features that were not part of the vertical acceleration path. Although I actively objected to Direct3D’s tendency to accumulate capability bits, the team felt extraordinary competitive pressure from Microsoft’s own OpenGL group and from the hardware vendors to support them.
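
As a rough mental model of that vertical structure, here is a toy C++ sketch of the three stages (transform, light, rasterize) chained together. The math is drastically simplified and the names are invented; it only illustrates how each layer feeds the one below it, the layers a vendor could progressively claim in hardware.

// A highly simplified sketch of the three vertical HAL stages described above:
// transform -> light -> rasterize. Hypothetical names and math, for illustration.
#include <cstdio>
#include <vector>

struct Vertex { float x, y, z; float brightness; };

// Stage 1: transformation (a simple translation standing in for a 4x4 matrix).
std::vector<Vertex> transform(std::vector<Vertex> verts, float dx, float dy, float dz) {
    for (Vertex& v : verts) { v.x += dx; v.y += dy; v.z += dz; }
    return verts;
}

// Stage 2: lighting (reduced here to depth-based shading from a single light).
std::vector<Vertex> light(std::vector<Vertex> verts) {
    for (Vertex& v : verts) v.brightness = 1.0f / (1.0f + v.z);
    return verts;
}

// Stage 3: rasterization (here just reporting what would be written to pixels).
void rasterize(const std::vector<Vertex>& verts) {
    for (const Vertex& v : verts)
        std::printf("pixel at (%.1f, %.1f) brightness %.2f\n", v.x, v.y, v.brightness);
}

int main() {
    std::vector<Vertex> tri = {{0, 0, 1, 0}, {1, 0, 2, 0}, {0, 1, 3, 0}};
    rasterize(light(transform(tri, 0.5f, 0.5f, 0.0f)));
}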

The hardware companies, seeking a competitive advantage for their own products, would threaten to support and promote OpenGL to game developers because the OpenGL driver model supported capability bits that enabled them to create features for their hardware that nobody else supported. It was common (and still is) for the hardware OEM’s to pay game developers to adopt features of their hardware unique to their products but incompatible with the installed base of gaming hardware, forcing consumers to constantly upgrade their graphics card to play the latest PC games. Game developers alternately hated capability bits because of their complexity and incompatibilities, but wanted to take the marketing dollars from the hardware OEM’s to support “non-standard” 3D features.

Overall I viewed this dynamic as destructive to a healthy PC gaming economy and advocated resisting the trend regardless of what the OpenGL people or the OEM’s wanted. I believed that creating a consistent, stable consumer market for PC games was more important than appeasing the hardware OEM’s. As such, I was a strong advocate of the relatively rigid vertical Direct3D pipeline and a proponent of introducing only API features that we expected to become universal over time. I freely confess that this view implied significant constraints on innovation in other areas and placed a high burden of market prescience on the Direct3D team.

The result, in my estimation, was pretty good. The Direct3D fixed function pipeline, as it was known, produced a very rich and growing PC gaming market with many healthy competitors through to DirectX 7.0 and the early 2000’s. The PC gaming market boomed and grew to be the largest gaming market on Earth. It also resulted in a very interesting change in the GPU hardware architecture over time.

Had the Direct3D HAL been a flat driver model with just capability bits for rasterization, as the OpenGL team at Microsoft had advocated, 3D hardware makers would have competed by accelerating just the bottom layer of the 3D rendering pipeline and adding differentiating features to their hardware via capability bits that were incompatible with their competitors’. The result of introducing the vertical layered architecture was that 3D hardware vendors were all encouraged to add features to their GPU’s that were more consistent with general purpose CPU architectures, namely very fast floating point operations, in a consistent way. Thus consumer GPU’s evolved over the years to increasingly resemble general purpose CPU’s … with one major difference. Because the 3D fixed function pipeline was rigid, the Direct3D architecture afforded very little opportunity for the frequent code branching that CPU’s are designed to optimize for. GPU’s achieved their amazing performance and parallelism in part by being free to assume that little or no branching code would ever occur inside a Direct3D graphics pipeline. Thus instead of evolving one giant monolithic CPU core with massive numbers of transistors dedicated to efficient branch prediction, as an Intel CPU has, a GPU has hundreds to thousands of simple cores that have no branch prediction. They can chew through a calculation at incredible speed confident in the knowledge that they will not be interrupted by code branching or random memory accesses to slow them down.
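
One everyday consequence of that assumption, still visible in shader-style code today, is that per-pixel logic tends to be written branch-free: both sides of a condition are computed and blended with a mask, so every lane of a wide core runs the same instruction stream. The C++ snippet below only illustrates the idiom; it is not Direct3D or shader code.

// Branchy versus branch-free per-pixel logic. Invented example, for illustration.
#include <cstdio>

// Branchy CPU-style shading: fine on a CPU with good branch prediction.
float shadeBranchy(float lit) {
    if (lit > 0.5f) return lit;          // fully lit side
    return 0.1f * lit;                   // shadowed side
}

// Branch-free GPU-style shading: compute both sides and blend with a mask.
float shadeBranchFree(float lit) {
    float mask = (lit > 0.5f) ? 1.0f : 0.0f;   // a shader would typically use step()
    return mask * lit + (1.0f - mask) * (0.1f * lit);
}

int main() {
    const float samples[] = {0.2f, 0.9f};
    for (float lit : samples)
        std::printf("branchy %.3f  branch-free %.3f\n",
                    shadeBranchy(lit), shadeBranchFree(lit));
}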

Up through DirectX 7.0, the underlying parallelism of the GPU was hidden from the game. As far as the game was concerned some hardware was just faster than other hardware, but the game should not have to worry about how or why. The early DirectX fixed function pipeline architecture had done a brilliant job of enabling dozens of disparate competing hardware vendors to all take different approaches to achieving superior cost and performance in consumer 3D without making a total mess of the PC gaming market for the game developers and consumers. It was not pretty and was not entirely executed with flawless precision, but it worked well enough to create an extremely vibrant PC gaming market through to the early 2000’s.

Before I move on to discussing the more modern evolution of Direct3D, I would like to highlight a few other important ideas that influenced the architecture of early modern Direct3D GPU’s. Recalling that in the early to mid 1990’s RAM was relatively expensive, there was a lot of emphasis on consumer 3D techniques that conserved on RAM usage. The Talisman architecture, which I have told many (well-deserved) derogatory stories about, was highly influenced by this observation.

[Image: Talisman]
Search this blog for tags “Talisman” and “OpenGL” for many stories about the internal political battles over these technologies within Microsoft

Talisman relied on a grab bag of graphics “tricks” to minimize GPU RAM usage that were not very generalized. The Direct3D team, heavily influenced by the Rendermorphics founders, had made a difficult choice in philosophical approach to creating a mass market for consumer 3D graphics. We had decided to go with a simpler, more general purpose approach to 3D that relied on a very memory-intensive data structure called a Z-buffer to achieve great looking results. Rendermorphics had managed to achieve very good 3D performance in pure software with a software Z-buffer in the Rendermorphics engine, which had given us the confidence to take the bet to go with a simpler, more general purpose 3D API and driver model and trust that the hardware RAM market and prices would eventually catch up. Note however that at the time we were designing Direct3D we did not know about the Microsoft Research Group’s “secret” Talisman project, nor did they expect that a small group of evangelists would cook up a new 3D API standard for gaming and launch it before their own wacky initiative could be deployed. In short, one of the big bets that Direct3D made was that the simplicity and elegance of Z-buffers for game development was worth the risk that consumer 3D hardware would struggle to affordably support them early on.

Despite the big bet on Z-buffer support we were intimately aware of two major limitations of the consumer PC architecture that needed to be addressed. The first was that the PC bus was generally very slow, and the second was that it was much slower to copy data from a graphics card than it was to copy data to a graphics card. What that generally meant was that our API design had to focus on sending data in the largest, most compact packages possible up to the GPU for processing and absolutely minimize any need to copy data back from the GPU for further processing on the CPU. This generally meant that the Direct3D API was optimized to package data up and send it on a one-way trip. This was of course an unfortunate constraint, because there were many brilliant 3D effects that could best be accomplished by mixing the CPU’s efficient branch prediction and robust floating point support with the GPU’s incredible parallel rendering performance.

One of the fascinating consequences of that constraint was that it forced GPU’s to become even more general purpose to compensate for the inability to share data with the CPU efficiently. This was possibly the opposite of what Intel intended to happen with its limited bus performance, because Intel was threatened by the idea that auxiliary cards would offload more processing from their CPU’s, thereby reducing the Intel CPU’s value and central role in PC computing. It was reasonably believed at that time that Intel deliberately dragged their feet on improving PC bus performance to deter a market for alternatives to their CPU’s for consumer media processing applications. Recall from my earlier blogs that the main reason for creating DirectX was to prevent Intel from trying to virtualize all Windows media support on the CPU. Had Intel adopted a PC bus architecture that enabled extremely fast access to system RAM shared by auxiliary devices, it is less likely that GPU’s would have evolved the relatively rich set of branching and floating point operations they support today.

To overcome the fairly stringent performance limitations of the PC bus, a great deal of thought was put into techniques for compressing and streamlining DirectX assets being sent to the GPU in order to minimize bus bandwidth limitations and the need for round trips from the GPU back to the CPU. The early need for the rigid 3D pipeline had interesting consequences later on when we began to explore streaming 3D assets over the Internet via modems.

We recognized early on that support for compressed texture maps would dramatically improve bus performance and reduce the amount of onboard RAM consumer GPU’s needed. The problem was that no standards existed for 3D texture formats at the time, and knowing how fast image compression technologies were evolving, I was loath to impose a Microsoft-specified one “prematurely” on the industry. To overcome this problem we came up with the idea of “blind compression formats”. The idea, which I believe was captured in one of the many DirectX patents that we filed, was that a GPU could encode and decode image textures in an unspecified format but that the DirectX API’s would allow the application to read and write to them as though they were always raw bitmaps. The Direct3D driver would encode and decode the image data as necessary under the hood without the application needing to know how it was actually being encoded on the hardware.
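
The shape of that idea, as I read it, is roughly the following: the application locks a texture and manipulates plain texels, while the driver quietly keeps the data in whatever private encoding the hardware prefers. The C++ sketch below uses an invented class and a stand-in 16-bit encoding purely to illustrate the contract; it is not the actual DirectX interface or the patented mechanism.

// A sketch of the "blind compression format" idea: the app reads/writes raw texels,
// the "driver" re-encodes them privately. Invented names; illustrative only.
#include <cstdint>
#include <cstdio>
#include <vector>

class BlindTexture {
public:
    explicit BlindTexture(int width, int height)
        : encoded_(static_cast<size_t>(width) * height) {}

    // Lock(): the driver decodes its private format into a raw bitmap the app can touch.
    std::vector<uint32_t>& Lock() {
        decoded_.resize(encoded_.size());
        for (size_t i = 0; i < encoded_.size(); ++i)
            decoded_[i] = decode(encoded_[i]);
        return decoded_;
    }

    // Unlock(): the driver re-encodes the raw bitmap back into its private format.
    void Unlock() {
        for (size_t i = 0; i < encoded_.size(); ++i)
            encoded_[i] = encode(decoded_[i]);
    }

private:
    // Stand-in "hardware" encoding: quantize to 16-bit 5-6-5 color. The app never sees this.
    static uint16_t encode(uint32_t rgba) {
        return static_cast<uint16_t>(((rgba >> 19) & 0x1F) << 11 |
                                     ((rgba >> 10) & 0x3F) << 5 |
                                     ((rgba >> 3) & 0x1F));
    }
    static uint32_t decode(uint16_t p) {
        return ((p >> 11) & 0x1F) << 19 | ((p >> 5) & 0x3F) << 10 | (p & 0x1F) << 3;
    }

    std::vector<uint16_t> encoded_;   // what the "hardware" actually stores
    std::vector<uint32_t> decoded_;   // what the application sees while locked
};

int main() {
    BlindTexture tex(2, 2);
    tex.Lock()[0] = 0x00FF8040;  // the app writes a raw RGB texel
    tex.Unlock();                // the driver silently re-encodes it
    std::printf("round-tripped texel: 0x%08X\n", tex.Lock()[0]);
}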

By 1998, 3D chip makers had begun to devise good quality 3D texture compression formats, such that by DirectX 6.0 we were able to license one of them (from S3) for inclusion with Direct3D.

http://www.microsoft.com/en-us/news/press/1998/mar98/s3pr.aspx

DirectX 6.0 was actually the first version of DirectX that was included in a consumer OS release (Windows 98). Until that time, DirectX was actually just a family of libraries that shipped with the Windows games that used them. DirectX was not actually a Windows API until five generations after its first release.

DirectX 7.0 was the last generation of DirectX that relied on the fixed function pipeline we had laid out in DirectX 2.0 with the first introduction of the Direct3D API. This was a very interesting transition period for Direct3D for several reasons:

1) The original DirectX team founders had all moved on,

2) Microsoft’s internal Talisman project and its reasons for supporting OpenGL had both passed,

3) Microsoft had brought game industry veterans like Seamus Blackley, Kevin Bacchus, Stuart Moulder and others into the company in senior roles,

4) Gaming had become a strategic focus for the company.

DirectX 8.0 marked a fascinating transition for Direct3D because, with the death of Talisman and the loss of strategic interest in OpenGL 3D support, many of the people from those groups came to work on Direct3D. Talisman, OpenGL and game industry veterans all came together to work on Direct3D 8.0. The result was very interesting. Looking back, I freely concede that I would not have made the same set of choices that this group made for DirectX 8.0, but it seems to me that everything worked out for the best anyway.

Direct3D 8.0 was influenced in several interesting ways by the market forces of the late 20th century. Microsoft had largely unified internally against OpenGL and found itself competing with the Khronos Group standards committee to advance Direct3D faster than OpenGL. With the death of SGI, control of the OpenGL standard fell into the hands of the 3D hardware OEM’s, who of course wanted to use the standard to enable them to create differentiating hardware features from their competitors and to force Microsoft to support 3D features they wanted to promote. The result was that Direct3D and OpenGL both became much more complex and tended to converge during this period. There was a stagnation in 3D feature adoption by game developers from DirectX 8.0 through DirectX 11.0 as a result of these changes. Creating game engines became so complex that the market also converged around a few leading providers, including Epic’s Unreal Engine and the Quake engine from id Software.

Had I been working on Direct3D at the time, I would have stridently resisted letting the 3D chip OEM’s lead Microsoft around by the nose chasing OpenGL features instead of focusing on enabling game developers and a consistent, quality consumer experience. I would have opposed introducing shader support in favor of trying to keep the Direct3D driver layer as vertically integrated as possible to ensure feature conformity among hardware vendors. I also would have strongly opposed abandoning DirectDraw support as was done in Direct3D 8.0. The 3D guys got out of control and decided that nobody should need pure 2D API’s once developers adopted 3D, failing to recognize that simple 2D API’s enabled a tremendous range of features and ease of programming that the majority of developers who were not 3D geniuses could easily understand and use. Forcing the market to learn 3D dramatically constrained the set of people with the expertise to adopt it. Microsoft later discovered the error in this decision and re-introduced DirectDraw as the Direct2D API. Basically, letting the 3D design geniuses loose on Direct3D 8.0 made it brilliant, powerful and useless to average developers.

At the time that DirectX 8.0 was being made I was starting my first company, WildTangent Inc., and ceased to be closely involved with what was going on with DirectX features; however, years later I was able to get back to my 3D roots and took the time to learn Direct3D programming in DirectX 11.1. Looking back, it’s interesting to see how the major architectural changes that were made in DirectX 8 resulted in the massively convoluted and nearly incomprehensible Direct3D API we see today. Remember the three-stage DirectX 2 pipeline that separated transformation, lighting and rasterization into three basic stages? Here is a diagram of the modern DirectX 11.1 3D pipeline.

[Image: DX 11 Pipeline]

Yes, it grew to 9 stages, or arguably 13 stages when some of the optional sub-stages, like the compute shader, are included. Speaking as somebody with an extremely lengthy background in very low-level 3D graphics programming, I’m embarrassed to confess that I struggled mightily to learn Direct3D 11.1 programming. The API had become very nearly incomprehensible and unlearnable. I have no idea how somebody without my extensive background in 3D and graphics could ever begin to learn how to program a modern 3D pipeline. As amazingly powerful and featureful as this pipeline is, it is also damn near unusable by any but a handful of the brightest minds in 3D graphics. In the course of catching up on my Direct3D I found myself simultaneously in awe of the astounding power of modern GPU’s and where they were going, and in shocked disgust at the absolute mess the 3D pipeline had become. It was as though the Direct3D API had become a dumping ground for every 3D feature the OEM’s had demanded over the years.

Had I not enjoyed the benefit of the decade-long break from Direct3D involvement, I would undoubtedly have a long history of bitter blogs written about what a mess my successors had made of a great and elegant vision for consumer 3D graphics. Weirdly, however, leaping forward in time to the present day, I am forced to admit that I’m not sure it was such a bad thing after all. The result of the stagnation of gaming on the PC, caused by the mess Microsoft and the OEM’s made of the Direct3D API, was a successful XBOX. Having a massively fragmented 3D API is not such a problem if game developers have only one hardware configuration to support, as is the case with a game console. Direct3D 8.0, with its early primitive shader support, was the basis for the first XBOX’s graphics API. For the first XBOX Microsoft selected an NVIDIA chip, giving NVIDIA a huge advantage in the 3D PC chip market. DirectX 9.0, with more advanced shader support, was the basis for the XBOX 360, for which Microsoft selected ATI to provide the 3D chip, this time handing AMD a huge advantage in the PC graphics market. In a sense the OEM’s had screwed themselves. By successfully influencing Microsoft and the OpenGL standards groups to adopt highly convoluted graphics pipelines to support all of their feature sets, they had forced themselves to generalize their GPU architectures, and the 3D chip market consolidated around whatever 3D chip architecture Microsoft selected for its consoles.

The net result was that the retail PC game market largely died. It was simply too costly, too insecure and too unstable a platform for publishing high production value games on any longer, with the partial exception of MMOG’s. Microsoft and the OEM’s had conspired together to kill the proverbial golden goose. No biggie for Microsoft as they were happy to gain complete control of the former PC gaming business by virtue of controlling the XBOX.

From the standpoint of the early DirectX vision, I would have said that this outcome was a foolish, shortsighted disaster. Had Microsoft maintained a little discipline and strategic focus on the Direct3D API, they could have ensured that there were NO other consoles in existence within a single generation by using the XBOX to strengthen the PC gaming market rather than inadvertently destroying it. While Microsoft congratulates itself for the first successful U.S. console launch, I would count all the gaming dollars collected by Sony, Nintendo and mobile gaming platforms over the years that might have remained on Microsoft-controlled platforms had Microsoft maintained a cohesive strategy across media platforms. I say all of this from a past-tense perspective because, today, I’m not so sure that I’m really all that unhappy with the result.

The new generation of consoles from Sony AND Microsoft has reverted to a PC architecture! The next generation GPU’s are massively parallel, general-purpose processors with intimate access to memory shared with the CPU. In fact, the GPU architecture became so generalized that a new pipeline stage called DirectCompute was added in DirectX 11 that simply allows the CPU to bypass the entire convoluted Direct3D graphics pipeline in favor of programming the GPU directly. With the introduction of DirectCompute the promise of simple 3D programming returned in an unexpected form. Modern GPU’s have become so powerful and flexible that the possibility of writing 3D engines directly for the GPU, without making any use of the traditional 3D pipeline, is an increasingly practical and appealing programming option. From my perspective here in the present day, I would anticipate that within a few short generations the need for the traditional Direct3D and OpenGL API’s will vanish in favor of new game engines with much richer and more diverse feature sets that are written entirely in device-independent shader languages like Nvidia’s CUDA and Microsoft’s AMP API’s.
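
To give a flavor of that “program the GPU directly” style, here is a minimal C++ AMP sketch, one of the routes mentioned above, assuming a C++ AMP-capable compiler such as Visual C++ 2012 or later. It dispatches a data-parallel kernel to the GPU with no vertex, pixel or rasterizer stages involved; the workload itself is invented for illustration.

// Minimal C++ AMP sketch: a data-parallel kernel sent straight to the GPU,
// bypassing the graphics pipeline entirely. Illustrative only.
#include <amp.h>
#include <cstdio>
#include <vector>

int main() {
    std::vector<float> data(1024, 1.0f);
    concurrency::array_view<float, 1> view(static_cast<int>(data.size()), data);

    // Each GPU thread handles one element; no shader stages are involved.
    concurrency::parallel_for_each(view.extent,
        [=](concurrency::index<1> idx) restrict(amp) {
            view[idx] = view[idx] * 2.0f + 1.0f;
        });

    view.synchronize();                       // copy results back to host memory
    std::printf("data[0] = %f\n", data[0]);   // expected 3.0
}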

Today, as a 3D physics engine developer, I have never been so excited about GPU programming because of the sheer power and relative ease of programming directly to the modern GPU without needing to master the enormously convoluted 3D pipelines associated with the Direct3D and OpenGL API’s. If I were responsible for Direct3D strategy today, I would be advocating dumping the investment in the traditional 3D pipeline in favor of rapidly opening direct access to a rich GPU programming environment. I personally never imagined that my early work on Direct3D would, within a couple of decades, contribute to the evolution of a new kind of ubiquitous processor that enables the kind of incredibly realistic and general modeling of light and physics that I had learned about in the 1980’s but never believed I would see computers powerful enough to model in real time during my active career.

HTC Mini – the Latest HTC Smartphone Officially Introduced in Indonesia

Some time ago, the HTC One smartphone was officially launched in Indonesia. That smartphone can be considered quite innovative, bringing technologies such as HTC BlinkFeed, HTC Zoe, HTC BoomSound and HTC Sense. Such features make it stand out from other Android phones. To reach consumers who want a cheaper and smaller HTC, HTC has now launched the HTC Mini in Indonesia.

The HTC Mini is essentially a smaller, lighter version of the earlier HTC model. This smartphone uses a dual-core 1.4 GHz Qualcomm Snapdragon 400 processor with Adreno 305 graphics. In addition, the phone runs the Android 4.2.2 Jelly Bean operating system.


The full technical specifications of this HTC handset are as follows:

4.3-inch 720 x 1280 Super LCD2 capacitive touchscreen with Corning Gorilla Glass 3
HTC Sense, HTC BlinkFeed, HTC BoomSound, HTC Zoe and HTC Ultrapixel
Qualcomm Snapdragon 400 dual-core 1.4 GHz processor
Adreno 305 GPU
1 GB of RAM
16 GB internal storage
4-megapixel rear camera and 1.6-megapixel front camera
Micro USB 2.0 connectivity, DLNA, Bluetooth, and WiFi
Android Jelly Bean (4.2.2)

As seen above, the hardware specs of the HTC Mini are cut down compared to the HTC One. However, this phone still has good hardware and you can still enjoy HTC’s innovative BlinkFeed, Zoe, Sense and BoomSound features. The camera itself still has a 4-megapixel resolution with HTC Ultrapixel technology. In terms of appearance, the phone is similar to the HTC One and is still made of aluminum, but with a plastic rim.

The phone is perfect for those who want to try and enjoy HTC’s features at a cheaper price. There is still no clear information about the price or when it will officially go on sale in Indonesia; for now, HTC has only officially introduced the HTC Mini there.

SkyDrive Windows 8.1, Download File Without Internet

Washington – After releasing a preview version of Windows 8.1 recently, Microsoft said that the final version will be released in August 2013. Windows 8.1 users will soon be able to access files on a Windows cloud-based storage service, SkyDrive, without having to connect to the Internet.
Microsoft announced that SkyDrive will be accompanied by support for offline access. Through SkyDrive service, users will be able to determine which files can be accessed without connecting to the Internet and then downloaded to the user’s device automatically.
Files that can be accessed offline will be easily identified when the user opens SkyDrive. In addition, Windows 8.1 users can also store files on SkyDrive in offline mode, which then can be directly uploaded when connected to the Internet network.
Tami Reller, Chief Financial Officer of Microsoft’s Windows division, said Windows 8.1 will be completed in August 2013. Reller did not say when users will be able to install the Windows 8.1 update, but she did show several new features and pieces of functionality in Windows 8.1.
Windows 8.1 users will get music search that is integrated with Xbox Music and will be able to share web pages to the Xbox Music application to create playlists. Another new feature in Windows 8.1 is Miracast, which streams HD video and audio over Wi-Fi to other displays, such as a TV. There are many other improvements in Windows 8.1 as well.

Chat apps ‘Nimbuzz Messenger’

Chat on Instant Messenger Applications Through One Application

Nimbuzz is for those of you who have accounts on popular instant messenger services. Using Nimbuzz, you can chat through your Yahoo Messenger, Facebook Chat, or Google Talk account from one application, namely Nimbuzz.

The methods it offers are diverse: you can use a desktop application that must be installed first, or go directly through the browser by logging in with your account on the official website via the Webchat feature. To check whether your Nimbuzz connection is working properly, a bot is provided that responds to your chat messages.

Nimbuzz also has a facility for calling abroad at a cheaper cost than regular telephone rates; you do this by purchasing credit for NimbuzzOut at the offered rates. Video chat is likewise available if you have a webcam or a camera that supports it. Sharing files such as audio, images, or video is also possible, as in most similar applications.

Nimbuzz Messenger is available across various platforms, ranging from the desktop PC to mobile. On mobile it is available for Android, iOS, BlackBerry, Windows Phone, Symbian and Java. Nimbuzz also provides a comprehensive list of phone brands that support the application. Users can chat across platforms, making it easier to keep in touch with friends and relatives online.

ASRock Z87 Pro3 Motherboard BIOS Updated to Newest Version 2.10

In order to improve the UEFI guide strings in its existing multi-lingual feature, ASRock has just released BIOS version 2.10 for the Z87 Pro3 motherboard.

The latest BIOS release itself consists of three files representing three different update methods: a Windows-based file, a DOS-compatible package, and a BIOS upgrade using Instant Flash.

If the Windows update method is preferred, run the exe file on your existing Windows installation and restart the computer after everything is finished. On restart the BIOS should be updated automatically, so make sure you do not interrupt the process. After the next reboot, make sure that you load the BIOS default settings.

The DOS update method is slightly different: first you are required to boot from a disk containing the bootable BIOS package. If you do not have a bootable DOS disk, be sure to make your own bootable disk containing the BIOS package. Once you have gone through the boot process and the DOS prompt “[drive]:>” appears on the screen, type the name of the executable file in question and follow all of the instructions on the screen; the upgrade will prompt you to restart the computer, and the BIOS will be updated after that.

With the Instant Flash procedure, you can upgrade the BIOS using a flash drive that contains a copy of the upgrade file downloaded from the vendor’s official website. Press the F2 key when booting and select the Instant Flash utility under the Tools menu. Compatible versions will be shown, and you are allowed to choose one of them.

Apart from offering these three BIOS update methods, the manufacturer basically does not recommend updating your BIOS version if the system is already working properly, and will not be responsible for damage caused by errors in the BIOS update procedure. Therefore, make sure you know exactly what you are doing before updating the BIOS to the latest version.

Facebook rallies 30%, logs best day ever

Facebook shares rallied an impressive 30% Thursday, allowing the stock to book its best one-day gain ever. And while shares remain about 10% below the May 2012 IPO price of $38, analysts are predicting that Facebook is finally on its way to reaching, and even crossing, that threshold.

“Facebook delivered its strongest quarter yet as a public company — results that we think could be thesis-changing for many,” said Doug Anmuth, a JPMorgan analyst who boosted his price target to $44 a share from $35.

Investors and analysts are most impressed by Facebook’s growing strength in mobile advertising — a part of the business they were initially most concerned about since Facebook lacked a clear strategy for mobile advertising despite the rapidly growing number of people using Facebook on their mobile phones and tablets.

“One year into Facebook’s mobile advertising efforts, mobile has increased from zero to 41% of total ad revenue,” Anmuth highlighted in a note to clients.

While the improvements have been gradual, Facebook blew everyone away this past quarter by generating 50% more in mobile ad revenue than what Wall Street was expecting.

Even after that stellar quarter, analysts say growth should remain strong as Facebook continues to shift toward more social ads that will become increasingly valuable to advertisers.

Analysts at JMP Securities, who increased their share price target to $38, said that the social media giant’s second-quarter results suggest that “Facebook is increasingly becoming a ‘must buy’ for advertisers.”

Goldman Sachs analysts were also excited by Facebook’s significant improvement in mobile advertising. They put a bullish price target of $46 on Facebook shares.

“We continue to believe Facebook is at the center of the mobile ad revolution and see considerable opportunity for it to drive higher pricing on its ad units as brand and direct marketers alike take advantage of its broad reach and precise targeting,” said Goldman analyst Heather Bellini.

As Facebook (FB) shares surged, a number of investors were getting in on the action. Over 360 million shares of Facebook had changed hands Thursday, more than seven times the stock’s average daily trading volume.

The day’s surge pushed the value of Facebook to more than $80 billion, up from just over $60 billion as of Wednesday’s closing bell.

Facebook’s advance was also getting plenty of attention on Twitter.

3-D Gun Printing: Here’s the Software That Stops It

New software has been developed that aims to restrict the manufacture of firearms that have been created using new 3-D technology.

The world’s first gun made using 3-D printing – called “The Liberator” – was successfully fired on May 6 in Austin, Texas. In just three days the blueprint created by startup Defense Distributed to produce the plastic gun had been downloaded around 100,000 times, according to Forbes Magazine.

Anti-gun campaigners have criticized the project, whilst lawmakers in different U.S. states have moved to pass new legislation to prohibit the manufacture, sale and use of the digitally made firearms. And now Danish startup Create It REAL has produced software that it says blocks users from printing guns in the first place.

“The likely buyers are 3-D printer manufacturers who want to minimize their liability risk and offer a firearm parental control feature to their customers,” Create It REAL’s CEO Jeremie Pierre Gay told CNBC.

“The feature creates a unique digital fingerprint of the firearm…the manufacturer could decide to block the print or to simply give a warning to the user of the potential danger.”

The software has taken a year to develop. The firm realized there was a gap in the market after surveying end users and 3D printer manufacturers. Gay told CNBC that his previous job working with Digital Rights Management (DRM) for technology firm Motorola served him well.

“[We realized] people are interested in the ability to put a lock on their firearms at home, the same should be possible on a 3D printer as a parental control feature. I would say that this feature is customer driven even if they did not know they wanted the feature when we asked,” he said.

“The possibility to make a firearm at home is not new, there are many plans on how to do it on Internet, the problem with 3-D printing is that it could become simply too easy, this feature makes it more complicated again.”

Cody Wilson, the man behind nonprofit Defense Distributed was skeptical that the product would actually be able to prohibit the printing of guns, which are produced as separate parts and then assembled. “The Liberator” is printed with hard plastic and fires a standard .380 caliber bullet. The only non-printed piece is a common hardware store nail which is used as its firing pin.

“Such software must walk a very fine line, of which I’ve no doubt it is incapable…It’s interesting PR to the uninitiated only,” the 25-year-old law student at the University of Texas told CNBC.

“‘The Liberator’ pistol is an assembly of over 17 parts, most of which individually would not set off a detection software unless the exact model was blacklisted. Think about it, springs, hammer, even the grip. These are not ‘guns’.”

‘Wild West’ Regulation

Both New York City and New York State have introduced legislation to curb the making of 3-D printed firearms or ban their use altogether and similar bills have been introduced in California. Linda Rosenthal, a New York State assembly member told CNBC that New York’s bill is currently sitting with the Codes Committee after the legislative session ended on June 21.

“I have all intentions of pursuing this legislation next session,” she told CNBC. “The controversy surrounding the passage of the SAFE ACT (the New York Secure Ammunition and Firearms Enforcement Act) will make passing any piece of gun-related legislation that much more difficult, but I think this is a very important issue that must be addressed before the technology becomes widely available.”

New York State should act as a leader to provide a national model for other states to follow, Rosenthal said, and welcomes the new software that she says is effectively the industry regulating its own behavior.

“It is important that industry and government partner with each other to address this issue. Neither acting alone will be able to accomplish enough to deter dangerous behavior….Given that no technology is foolproof, it is critical that the states have strong tools available to discourage wrongdoing and criminalize bad behavior,” she told CNBC.

“3D printing is a very new technology, and in terms of regulation, it is the Wild West out there.”

Despite the issues surrounding the manufacture of firearms, 3-D printing – creating three-dimensional solid objects from digital models – is gathering momentum and is transforming everything from medicine to home goods. Printers that once cost $30,000 now are priced closer to $1,000 and have the potential to rewrite the rules of global manufacturing.

The market for 3-D printing was estimated at about $1.7 billion in 2011 and could hit $6.5 billion by 2019, according to research firm Wohlers Associates.

Nokia Talks About the Lumia 1020’s Excellence

HELSINKI – The 41-megapixel camera is the major advantage presented by the Lumia 1020. But Nokia claims that this smartphone has many other selling points as well.

Matt Rothschild, Head of Marketing and Sales for Nokia in North America, said the camera is not the Lumia 1020’s only advantage. Hardware elements, from the AMOLED screen to accessories such as the Camera Grip and the shutter button, are, says Rothschild, among the smartphone’s other advantages.

“On the whole, this is what we refer to as the total product. It (the Lumia 1020) is very beautiful to grip, has a good balance, and is well designed. That’s what we call a consideration,” Rothschild said, as quoted from Venture Beat, Monday (15/07/2013).

As for the camera, Rothschild said, the Lumia 1020 has a very broad target audience, ranging from professionals to those who take up photography only as a hobby.

“But the most important thing for us is that when we talk to customers, they tell us that they want to have good pictures. Everyone knows that smartphones now double as photographic devices that can be carried anywhere, so this is the core thing that all people are looking for,” he concluded.

Tablet Tabulet Tabz Voice

One of the local players, Tabulet, presents a tablet variant at an affordable price. The Tabulet Tabz Voice offers as its flagship feature support for high-definition multimedia content, and also provides a GSM slot that can be used to make phone calls and send SMS. With a 3000 mAh battery, the device is claimed to last longer: 7 to 8 days in standby mode.

With the added “Voice” in its name, this device indeed favors communication features via phone and SMS. These all run smoothly without any obstacles; it is just that the network support is only 2.75G, which is less convenient if you want to use it for an internet connection. The solution: the Tabz Voice supports the use of a USB modem via a USB OTG cable.

Using a unibody design, there is no casing that must be removed; you just insert your SIM card if you want to use it for communication via phone/SMS. However, this function does not support hot swapping. If you insert the SIM card without turning off the tablet, the SIM card is not immediately detected; you need to restart first to activate it.

The Tabulet Tabz Voice has front and rear cameras, and you can use the front camera for video calls through third-party applications such as Skype. The 2-megapixel rear camera produces optimal images in brightly lit outdoor conditions; the results do not look too blurry and the colors are natural. You can also use it to record video at VGA resolution.

To play high-definition video, the supporting Super HD-Player application is included. Video codec and format support is diverse, ranging from RM/RMVB, AVI, MKV, MOV, DAT and FLV to H.264, all of which can be played directly. So if you have a video file and move it to this tablet, it can be played directly without needing to convert it first. Sound output through the internal speaker is quite loud, but it is mono only; if you want stereo sound, you can use headphones.

The set of default applications is fairly small, but it includes three exciting games: Angry Birds, Fruit Ninja, and Temple Run. To support your other activities, you can of course install more yourself through Google Play.

The Tabulet Tabz Voice offers an interesting alternative for a tablet at an affordable price. Its ability to make phone calls and send SMS, as well as play high-definition video, is an advantage rarely found in its price range.