Old model of depending on public contributions to fund government spending will not work any more; it is not sustainable

From Chua Soo Kiat
04:45 AM Dec 01, 2012

As pointed out in the letter “Dual public housing system is sustainable, better: SDP” (Nov 22), revenue from land sales goes into government reserves. As seen in the Housing and Development Board’s (HDB) latest annual report, its land cost was S$4.4 billion.

Thus, if we were to accept the Singapore Democratic Party’s (SDP) proposal of Non-Open Market flats, which strips out land cost, we would have to accept lower contributions to the reserves from land sales.

The more NOM flats there are, the less the contributions and the lower the net investment returns contributions (NIRC) in the future, ceteris paribus.

NIRC is a key government revenue component to help fund programmes from education to defence and, increasingly, social and medical spending.

With NOM flats, the tax increase the Prime Minister spoke about at the National Day Rally may come sooner than projected, as taxpayers would be asked to pick up any shortfall in NIRC.

The psychological impact of NOM flats cannot be underestimated, too. Between an NOM flat and an open market flat in a similar location, Singaporeans who want to be owner-occupiers would naturally go for the former.

After all, land proceeds can be viewed as contributions to a national social fund, so why contribute if you have a choice? Furthermore, social benefits such as Goods and Services Tax Vouchers and Workfare bonuses do not depend on how much land cost one pays.

In fact, one may get more social benefits in a NOM flat, as it should come with a lower assessable value, given the restrictions on rental and sale. Strong demand for NOM flats would curb any price appreciation in the open market, and a vicious circle may form.

Nevertheless, the SDP’s NOM paper is a commendable effort. I hope to hear from ministers and elected opposition members on this, given its proposed impact on the reserves and the value of HDB flats.


NOM flats will work only if the sale of the flat is restricted: if the family no longer needs it, or wishes to upgrade, the flat must be returned to the HDB at a price the HDB determines. Inflating HDB land to market forces only creates an endless bubble, with no end to the high cost of everything in Singapore. If the HDB cannot create a range of flats to cater to the diverse needs of the public, then what was it set up for? Because of the restrictions, the impact on the property market would be minimal, but it would cut off the endless rise in demand and create affordable flats for the masses.

The NOM component could be based on 50 per cent of market land costs; in this way, you still derive a profit, just a much lower one. There are hundreds of other ways to make money, such as building more shops, shopping centres, amenities, MRT lines and so on, and it would create better public opinion of the Government. Ideally, these flats should be located not in central areas but in suburban ones, where there is widespread land on which to build new towns.

You want to raise billions of dollars to enjoy a Swiss standard of living? Then start privatising the remaining HDB flats area by area. Let private bankers control the expansion of credit so as to control inflation; there will be liquidity in the markets to fund anything you want. For example, if the valuation of a flat is $500K and $300K is paid up, the owner cannot borrow more than 70 or 80 per cent of the flat's valuation, so you would have created an expansion of credit of $X.
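The borrowing cap in the example above can be sketched in a few lines. This is a minimal illustration of the arithmetic only; the function name and the interpretation (the loan-to-value cap applies to total borrowing against the flat, so available new credit is the cap minus the amount still owed) are my assumptions, not a rule stated in the letter.

```python
# Hypothetical loan-to-value (LTV) cap from the example:
# flat valued at S$500K, S$300K paid up, borrowing capped at 70-80% of valuation.
def available_credit(valuation, paid_up, ltv_cap=0.8):
    """Further credit available against the flat under an LTV cap (a sketch)."""
    max_loan = valuation * ltv_cap       # most the bank may lend in total
    outstanding = valuation - paid_up    # amount still owed on the flat
    return max(0.0, max_loan - outstanding)

# With an 80% cap: 400K max loan - 200K outstanding = 200K of new credit.
credit = available_credit(500_000, 300_000, 0.8)
print(credit)
```

Under these assumptions, each paid-up flat unlocks roughly (cap × valuation − outstanding debt) of fresh liquidity, which is the "expansion of credit" the letter describes.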

– Contributed by Oogle.

Making all the wrong moves

By Paul McDougall  InformationWeek
November 30, 2012 09:08 AM

Surface may be just the beginning. Microsoft CEO Steve Ballmer told investors at the company’s annual shareholder meeting that pressure to innovate and get new products to market faster means Microsoft will need to produce more of its own hardware going forward.

Responding to a shareholder who questioned why the company has fallen so far behind in the tablet market, despite the fact that chairman Bill Gates, present but silent throughout the Q&A session, showed off a tablet prototype more than a decade ago, Ballmer said Microsoft may have been too reliant on external hardware makers to develop the product.

“Bill did hold up a tablet a number of years ago,” Ballmer said. “And, not that we don’t have good hardware partners, but sometimes getting the innovation right across the seam of hardware and software is difficult unless you do both of them,” Ballmer said at the meeting, held Wednesday in Bellevue, Wash.

Referring to the company’s new strategy of building its own Windows 8 tablets under the Surface brand, Ballmer said “maybe we should have done that earlier, maybe [Gates’] tablet would have shipped sooner.”

Ballmer then left little doubt that Microsoft is no longer content to be solely reliant on third-party PC manufacturers for its success. “What we’ve said to ourselves now is that there is no boundary between hardware and software that we will let build up as a kind of innovation barrier.”

[ Windows 8 early sales numbers are weak. Should Microsoft pull a Coke and introduce Windows Classic? ]

Ballmer’s comments are sure to add fuel to rumors that the company’s next hardware product will be a self-branded smartphone. Digitimes, an overseas publication that tracks Asian component manufacturers, this week published a report that said Microsoft has contracted with iPhone-maker Foxconn to produce a smartphone. Microsoft has not commented on the report.

Another sign that Microsoft is looking to become a major player in hardware is that it’s looking to hire individuals with experience in hardware manufacturing and supply-chain management. On Thursday, the Microsoft Careers Twitter feed carried a tweet saying, “At @Microsoft, we’re more than just software. Come show us your Hardware Engineering talents.” It included a link to Microsoft’s hardware engineering jobs page, which listed numerous open positions.

Microsoft’s hardware designs could bring it into conflict with PC OEMs, on which it still depends for the bulk of its Windows revenues. Lingxian Lang, China operations manager for Acer, recently said publicly that Redmond’s plan to compete with partners would ultimately see it eating “hard rice.”

Meanwhile, Microsoft has revealed more details about Surface Pro, the version of Surface that will run full-blown Windows 8 Professional and legacy Windows applications. The notebook-tablet hybrid will hit stores in January, starting at $899 for the 64-GB version and $999 for the 128-GB version.

Microsoft released Surface RT, which runs a pared-down version of Windows 8 called Windows RT, on Oct. 26. Surface RT runs an ARM-based Nvidia Tegra 3 chip that promises low power consumption and long battery life, but it will run only pre-installed Microsoft software and apps downloaded from the Windows Store. Surface RT starts at $499.

Upgrading isn’t the easy decision that Win 7 was. We take a close look at Server 2012, changes to mobility and security, and more in the new Here Comes Windows 8 issue of InformationWeek. Also in this issue: Why you should have the difficult conversations about the value of OS and PC upgrades before discussing Windows 8. (Free registration required.)


You did not achieve perfection with Windows 8, and now you want to get into the hardware business; very soon everything will fall apart when all your hardware vendors abandon you. What you should do instead is set up another division to offer consulting services to hardware manufacturers, fully customising PCs, tablets and smartphones for different industries, and develop deep expertise in software so that you can easily customise your OS and applications. You are lagging behind because you are now a dinosaur that finds it difficult to embrace change and innovation, and you are not hungry enough for success: you exploit technologies for maximum profit but never keep a wildcard up your sleeve. Sitting on a pile of cash and doing nothing will only land you in stagnant waters.

– Contributed By Oogle.

HFT: An interview with expert Edgar Perez

Wed, 28 Nov 2012 09:12:00 GMT

Edgar Perez

In this interview for HFT Review, we talk to Edgar Perez, author of The Speed Traders, who will be hosting the HFT Leaders Forum in London on 12th December 2012.

As one of the HFT industry’s most well-connected individuals, who is a frequent speaker at industry conferences and who is regularly invited to discuss the topic on media channels around the world, Edgar is well-placed to deliver his views on the changing nature of high frequency trading in today’s markets.

HFT Review: Edgar, welcome to HFT Review. To start with, what is your own definition of high frequency trading?

Edgar Perez: Personally, the definition of high-frequency trading is not set in stone. What I can give you is a definition that is practical at this moment: high-frequency trading is really a number of techniques to facilitate speed trading. This is of course an evolving definition. Whatever is HFT today was probably ultra HFT in the past and will probably be low HFT in the future. What we can say at this moment about HFT is the big impact speed has on it.

Speed plays an important role and it is a big component of the definition of HFT. By that we mean speeds in the millisecond or microsecond range. In addition to that, in order to minimize the latency of the trading systems, co-location is an inherent part of the definition. As speed traders try to reduce any latency, they will want to trade from computers hosted on the exchanges themselves.

As a trader, you don’t want your computers to be placed somewhere far away from the exchanges. Therefore, you rent space from the exchange; that will be the second component.

As the third component of the definition, we are talking about people who trade in and out throughout the day. As a high frequency trader, you are not interested in holding instruments for any longer than the strategy requires, and almost never overnight, as that would expose you to additional risks. It’s been said that high frequency traders are in the moving business, not the storage business. That means they will buy shares and then sell them at a profit almost immediately. If the transaction itself is not profitable, it becomes profitable thanks to the rebate that exchanges pay traders either for providing or taking liquidity.

The fourth component I can think of is that we are looking at trading instruments that are very liquid; if you look at the markets, the most liquid instrument in recent months was probably Citigroup: it was liquid, low priced and therefore easy to buy and sell, until they engineered a reverse merger. Bank of America eventually took that place.

When you are in HFT, you are basically looking at margins that are very, very small. It could be one-tenth of a penny per transaction in the market-making business; therefore, you want to make sure that you are using your capital effectively and efficiently.

That means speed traders will want to trade the lowest-priced instruments (e.g. Bank of America), as the profit stays constant for each transaction, and the percentage margin will be higher. That is why trading liquid and low-priced instruments becomes an important component of HFT.

As you can see from this comprehensive definition, not all orders generated automatically by computers can be called high frequency trading. Trading has to somehow include speed, co-location, limited overnight exposure, and highly-liquid instruments, to potentially qualify as high-frequency, in my opinion. My opinion will be different to the opinion of other experts, practitioners or academics; that’s why I think spending time in coming to a standard definition of HFT is wasteful; industry participants would rather discuss manipulative practices that can be facilitated by these technologies, and the best way to monitor and eliminate them from financial markets.

HFTR: This is a very fast-moving industry. What have been the most significant changes to the HFT landscape since your book “The Speed Traders” was published early last year?

EP: From my point of view, the most dramatic change is happening in the U.S. equity markets (where it all started), with declining volumes for the third straight year. As you can see in this recent post, http://thespeedtraders.com/2012/11/12/the-new-york-times-reports-declining-u-s-high-frequency-trading/, on my blog http://thespeedtraders.com, this phenomenon, coupled with the lack of volatility (reflected in the lower share of HFT participation in the volume), is negatively impacting the industry. HFT relies on volume and volatility; there are certainly macro factors that go beyond the equity markets, but this is pushing HFT firms that focused only on equities to explore other asset classes and monitor new geographies, as well as to develop services businesses leveraging their existing infrastructure.

Another important change in the last year has been the regulatory aggressiveness experienced by the industry in certain European countries, with initiatives that have ranged from financial transaction taxes to outright limitations on HFT activities. Any financial tax would conceivably be catastrophic for both high-frequency trading and the financial systems that adopt it. It would not affect long-term investors who exhibit low turnover in their trading. But people in HFT who rely on small margins to be profitable will be impacted by any transaction tax; let’s remember that for market-makers, margins may be in the range of one-tenth of a penny per transaction. Any transaction tax would wipe out those margins.

HFTR: What do you see as the biggest challenges facing HFT firms today? What about the wider industry?

EP: After the financial crisis of 2008, influential audiences blamed financial firms for it, and part of that blame was focused on high-frequency trading. Therefore, they assume that taxing financial transactions will help solve many of the problems currently affecting the economy, as it would raise billions of dollars. My sense is that any transaction tax would ultimately be passed on to investors and consumers; therefore, I don’t see any efficacy in imposing transaction taxes. It is going to be a difficult game as well; neither the US nor Europe will want to be the only bloc adopting a financial transaction tax, if they ever agree on a proposal. Ultimately, financial participants will just move somewhere else; financial authorities in Asian and Latin American markets will no doubt welcome them.

HFTR: Where are the main HFT opportunities today (in terms of asset class, geography, trading strategy, etc)?

EP: Given declining volumes in traditional asset classes in developed markets, it is only natural to expect players to focus on new asset classes and new geographies, employing multi-asset trading strategies. While there are additional costs and limitations in many of these emerging economies, players will be smart to monitor the evolution of regulation in these geographies and be ready to jump when the opportunity becomes available, which will happen sooner or later.

HFTR: How do you anticipate HFT-type trading strategies (such as electronic market-making, cross-market arbitrage, rebate capture, etc) will evolve in the face of increasingly restrictive regulatory proposals? 

EP: Low-hanging-fruit strategies are characterized by limited shelf-life profits, and that has been the case with the most popular ones used by high-frequency traders. Take market making: the firms that were super successful in this game (e.g. GETCO) continue investing huge amounts of money merely to maintain their competitive advantage, while at the same time trying to find new revenue and profit sources leveraging their expertise. Why? Declining volumes and declining volatility spell trouble for them. We are getting to the point where even rebate-capturing strategies are under the microscope and risk no longer being enabled by the exchanges, so the only alternative is moving to more IQ-intensive strategies that can be more sustainable in the long term.

HFTR: From your perspective, which regulatory proposals make sense and which are ill-thought through and unworkable? Is there anything that regulators should be doing that they’re not? 

EP: HFT, like any other areas of finance, should have sensible regulations imposed to promote sound trading practices and protect the average investor from predatory behavior. If a non-HFT market participant believes that he/she cannot enter into fair transactions then that individual will not invest in that market. To create that atmosphere of trust and effectively regulate global financial markets, regulators must be armed with the capacity to analyze trading activity in real time.

If HFT is predicated on speed, its regulation must also be built around the same requirement. Real time information will allow regulators to see everything that is occurring in the markets, no matter how quickly the order information is being posted and transactions are occurring. This will require significant commitments to invest in both human capital and information technology; however, it is vital for regulators to level the playing field of HFT in order to best supervise it.

Real time policing of the marketplace for potential malfeasance is the most efficient way to regulate HFT. Just a few months ago, the SEC missed an opportunity to move in this direction faster by approving a rule requiring exchanges and broker-dealers to provide trade information to a central repository by 8 a.m. the next trading day, as opposed to the original real-time reporting requirement. While either development would have taken years to implement, once in place it would have the potential to immediately alert and stop erroneous trading activity such as that experienced by Knight Capital for 45 minutes on August 1.

HFTR: Have the markets (particularly the equity markets) become too complex for the majority of end-investors, in terms of proliferation of order types, proliferation of trading/matching venues, etc? What can/should be done about this increasing level of complexity?

EP: Complexity can be a blessing in disguise. It could be a blessing because investors benefit from the access to a number of different venues to trade, all of them competing with one another on price, time and service.  On the other hand, complexity brings unknown challenges, which can only be dealt with once they become known after an unexpected incident.

During the past four decades, a wave of innovation has reshaped the way financial markets work, in a manner that once seemed able to deliver only huge benefits for all concerned. But this innovation became so intense that it outran the comprehension of most of its actors, not to mention regulators. Now it is time to leverage this progress by tracking and monitoring this complexity in real time, as the only way to guarantee that it benefits all market participants with minimum disruption.

HFTR: What do you see as some of the most disruptive technologies coming along in the financial trading space? 

EP: In an industry predicated on speed, the disruption will come from technologies that bring the communication latency to the minimum. While most financial services firms have relied on cable, there’s only a limit to the speed that can be achieved. In this regard, I am particularly excited about the application of microwave technologies to this challenge.

HFTR: Thank you Edgar


HFT requires multiple newsfeeds covering major events happening around the world, linked to your HFT computers, which analyse the movement of the markets with algorithms and develop a strategy through different analytic tools and views, taking advantage of minor changes in the underlying asset by going in on volume to maximise profits; speed traders very rarely hold their positions overnight. As such, there must be regulation so that speed traders do not exceed their credit: orders need to be matched by the HFT servers just as fast, through an interface where orders are queued and matched, and those not matched after a period of time are discarded. The problem in developing this customised solution is identifying all the requirements to customise both your hardware and software, including interfaces that can handle the load with a failsafe mechanism; and if problems occur, who will be responsible for the losses? Present technology is limited, and a change of mindset is needed to reinvent the HFT trading system from the ground up. I can easily visualise everything, from hardware to software to interfaces, with a total solution that looks into every aspect, even designing all the software so that the failsafe mechanism can be monitored by huge data centres controlling every single process, so that in the event of a problem everything can be rectified. Nobody in the world has my expertise and solutions. I want to see who will lose more; every day I delay, everybody loses hundreds of thousands. I have great patience and tolerance, and nobody can cheat me.
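The queue-and-discard interface described above can be sketched as a toy in-memory structure. This is only an illustration of the idea (orders wait in a FIFO queue and are dropped once they exceed a maximum age); the class name, the half-second expiry, and the string orders are my assumptions, not part of any real matching engine.

```python
from collections import deque
import time

class OrderQueue:
    """Toy sketch of the interface described above: orders are queued for
    matching and discarded once they have waited longer than max_age seconds."""
    def __init__(self, max_age=0.5):
        self.max_age = max_age
        self.queue = deque()  # holds (timestamp, order) pairs, oldest first

    def submit(self, order, now=None):
        """Enqueue an order, stamping it with its arrival time."""
        self.queue.append((now if now is not None else time.monotonic(), order))

    def next_live_order(self, now=None):
        """Pop the oldest order that has not expired; expired ones are discarded."""
        now = now if now is not None else time.monotonic()
        while self.queue:
            ts, order = self.queue.popleft()
            if now - ts <= self.max_age:
                return order  # still live: hand it to the matcher
        return None  # queue empty or everything expired

# Explicit timestamps make the behaviour deterministic:
q = OrderQueue(max_age=0.5)
q.submit("buy 100 XYZ", now=0.0)
q.submit("sell 50 XYZ", now=0.4)
print(q.next_live_order(now=0.6))  # first order expired; prints: sell 50 XYZ
```

A production system would of course match orders against a book and persist discards for audit; the point here is only the timeout-discard discipline the commentary calls for.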

First Newsfeed module.

Like Google News’s design, you need to identify and categorise all the different categories of news, put them on a timeline, and use a parser program like those used in marking exam papers. The difference is that the algorithm must be able to identify keywords, paragraphs and the entire page, and calculate how the news will affect the different markets on a plus or minus scale. You will also need a failsafe add-on that can identify the source of the news and whether it is true or false, by comparing its effects on the markets or against different sources; if something does not check out, it will be highlighted for removal from the newsfeed system, or everything will be skewed.
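The keyword-scoring and cross-checking steps above can be sketched as follows. The keyword weights, source names, and the majority-vote failsafe are illustrative assumptions of mine; a real newsfeed module would need far richer language handling than substring matching.

```python
# Hypothetical keyword weights for the plus/minus scoring described above.
MARKET_KEYWORDS = {
    "default": -3, "downgrade": -2, "recession": -2,
    "upgrade": +2, "record profit": +3, "rate cut": +1,
}

def score_headline(headline):
    """Crude +/- market-impact score: sum the weights of matched keywords."""
    text = headline.lower()
    return sum(w for kw, w in MARKET_KEYWORDS.items() if kw in text)

def cross_check(scores_by_source):
    """Failsafe sketch: flag sources whose score disagrees in sign with the
    majority of sources, as candidates for removal from the feed."""
    majority = sum(1 if s > 0 else -1 for s in scores_by_source.values() if s != 0)
    return [src for src, s in scores_by_source.items()
            if s != 0 and (s > 0) != (majority > 0)]

print(score_headline("Moody's downgrade sparks recession fears"))  # prints: -4
print(cross_check({"Reuters": -4, "AP": -3, "BlogX": 2}))  # prints: ['BlogX']
```

The same scoring pass would run per paragraph and per page, as the commentary suggests, with the cross-check feeding the removal list.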

– Contributed by Oogle.

Breaking: Puzzle Solved: Black Holes Are Entrances/Exits to Another Time and Dimension

If my theory is correct, black holes are entrances to another time and dimension, so what you observe and calculate according to applicable laws will not apply; the mass in one dimension is different from that in the tunnel, which is so small it is like squeezing a camel through the eye of a needle. No applicable laws can explain this theory, so you need to create new laws, based on the breakdown of the spectrum of light, to calculate the exact mass, and the mass that goes missing as it travels through the centre of the black hole. When conduction was factored in, the X-ray emissions expected from the black hole in the model matched up well with what astronomers actually see. The end result is that the Milky Way’s black hole gobbles up 100 times less material than astronomers had previously predicted, because the other 99 per cent is spiralled through the tunnel into another space and dimension. Likewise, when a person dies, his spirit (human aura) will have a mass (not dark matter) that cannot be detected, and will travel through the centre of the galaxy to another dimension via black holes. Therefore different black holes are entrances to different dimensions. Time travel is possible, but mankind has not learnt how to reassemble the molecules at the end of the travel (the Philadelphia Experiment). UFOs are secret experiments conducted by the US into time travel; they have succeeded in creating a window into the future, but not time travel itself. Mind control and thought broadcasting are real, but a manifestation of the spirit.

But NGC 1277 is stranger still, and could help advance our theories of how black holes evolve in the first place.

“This galaxy seems to be very old,” Dr Van den Bosch said. “So somehow this black hole grew very quickly a long time ago, but since then that galaxy has been sitting there not forming any new stars or anything else.

“We’re trying to figure out how this happens, and we don’t have an answer for that yet. But that’s why it’s cool.”

Why there are no new stars even though the mass is so massive is that this is the other exit from another black hole: even if the galaxy is small, that has no effect on the massive black hole. Mass may enter through another black hole that absorbs everything, and it will end up here; this is the characteristic of a black hole’s exit.

– Contributed by Oogle.


Saturday, Dec 01, 2012

SANTIAGO – US astronomers have detected the most powerful blast from a quasar ever, offering the first proof of important theories about why the universe is shaped the way it is.

The beam of energy, detected by the European Southern Observatory’s Very Large Telescope, based in Chile, was at least five times larger than any observed before.

The new analysis identified a huge flow of energy – 2 trillion times as powerful as the Sun and 400 times more massive – streaming from a quasar known as SDSS J1106+1939.

“I’ve been looking for something like this for a decade,” said lead researcher Nahum Arav, from Virginia Tech University, “so it’s thrilling to finally find one of the monster outflows that have been predicted.”

Quasars are celestial bodies that look like extraordinarily bright stars. But astronomers now believe quasars are not stars at all, and that they draw their power from the enormous black holes at the center of newly forming galaxies.

Because they are so far away – meaning it has taken billions of years for their light to reach even the most powerful telescopes – quasars provide glimpses of the ancient history of the universe.

And while black holes are known for sucking energy in, quasars also take some of the energy around them and shoot it back into the universe at high speed.

Astronomers theorize that these energy flows help explain why there are so few large galaxies and how the mass of a galaxy is linked to its central black hole.

But until now, the powerful beams of energy were merely speculation.

“This is the first time that a quasar outflow has been measured to have the sort of very high energies that are predicted by theory,” Arav said.

Quasar SDSS J1106+1939 had already been identified, but this was the first time its outflow had been accurately measured in great detail.


There are two types of exit from black holes: one is dormant, with enormous mass but not visible to the naked eye, while the other is located near a quasar, where the black hole is, and all material is spat back out into the galaxy. Why is one black hole’s exit dormant while the other acts like a quasar? I suppose it is due to the different dimensions of space and time, but I cannot explain why; anybody want to try? The answers can be found in the golden ratio of the logarithmic spiral, time dilation, half-life and 4D, where special relativity will not apply when you travel faster than the speed of light. In the Philadelphia Experiment, the laws of special relativity applied because time travel causes dilation and decay; that is why the molecules could not form back together, causing humans to become embedded in the ship. If we could chart everything from one end through the pin centre and out the other end in 4D, we might find the reason with computer simulation. We must take two different views: one where you travel slower than light and the laws of special relativity apply, and one where you travel faster than light, for which there is no formula at present. An extension to the laws of special relativity?


The black hole puzzle solved
One Dimension leads to the past and future Dimension
Light and matter spiral towards the centre, where they are compressed until the forces concentrate towards the eye of a needle and are forced through to another dimension. Which time dimension depends on whether the travel is faster or slower than light: when slower than light, it will dilate and decay, perhaps moving into another dimension which is the past of the dimension it enters; when faster than light, it will be narrowed and concentrated, and will spit out everything in its beam, forming a quasar-like output into the future. You need to differentiate space/time in a vacuum, dark matter, faster or slower than light, time dilation and decay, and the gravity of subatomic particles before you can merge the theory of everything and harvest nuclear fusion energy.

In 1997, Juan Maldacena noted that the low energy excitations of a theory near a black hole consist of objects close to the horizon, which for extreme charged black holes looks like an anti-de Sitter space. He noted that in this limit the gauge theory describes the string excitations near the branes. So he hypothesized that string theory on a near-horizon extreme-charged black-hole geometry, an anti-de Sitter space times a sphere with flux, is equally well described by the low-energy limiting gauge theory, the N=4 supersymmetric Yang-Mills theory. This hypothesis, which is called the AdS/CFT correspondence, was further developed by Steven Gubser, Igor Klebanov and Alexander Polyakov, and by Edward Witten, and it is now well-accepted. It is a concrete realization of the holographic principle, which has far-reaching implications for black holes, locality and information in physics, as well as the nature of the gravitational interaction. Through this relationship, string theory has been shown to be related to gauge theories like quantum chromodynamics, and this has led to more quantitative understanding of the behavior of hadrons, bringing string theory back to its roots.

Internationalising the Global markets

The swap and futures markets will grow more rapidly if they are decentralised than if they are not, depending on the capitalisation of the local markets and the cost of interest, where competition will drive costs down.

In finance, a swap is a derivative in which counterparties exchange cash flows of one party’s financial instrument for those of the other party’s financial instrument. The benefits in question depend on the type of financial instruments involved. For example, in the case of a swap involving two bonds, the benefits in question can be the periodic interest (or coupon) payments associated with the bonds. Specifically, the two counterparties agree to exchange one stream of cash flows against another stream. These streams are called the legs of the swap. The swap agreement defines the dates when the cash flows are to be paid and the way they are calculated.[1] Usually at the time when the contract is initiated at least one of these series of cash flows is determined by a random or uncertain variable such as an interest rate, foreign exchange rate, equity price or commodity price.[1]

The cash flows are calculated over a notional principal amount. Contrary to a future, a forward or an option, the notional amount is usually not exchanged between counterparties. Consequently, swaps can be settled in cash or collateral.
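The leg-and-notional mechanics above can be made concrete with one accrual period of a plain-vanilla fixed-for-floating interest rate swap. The function name, rates, and notional below are illustrative assumptions; the point is that the two legs are computed over the notional but only their difference changes hands.

```python
# Sketch of one period's cash flows in a fixed-for-floating interest rate swap.
def swap_period_net(notional, fixed_rate, floating_rate, year_fraction):
    """Net payment to the fixed-rate payer for one accrual period.
    The notional itself is never exchanged; only the leg difference is paid."""
    fixed_leg = notional * fixed_rate * year_fraction        # what the fixed payer owes
    floating_leg = notional * floating_rate * year_fraction  # what they receive
    return floating_leg - fixed_leg  # positive: fixed payer collects the difference

# S$10m notional, 3% fixed vs 3.5% floating, semi-annual accrual (0.5 years):
net = swap_period_net(10_000_000, 0.03, 0.035, 0.5)
print(round(net, 2))  # prints: 25000.0
```

With the floating leg reset each period to a reference rate, the stream of such net payments is exactly the "exchange of cash flows" the definition describes.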

Swaps can be used to hedge certain risks such as interest rate risk, or to speculate on changes in the expected direction of underlying prices.

Swaps were first introduced to the public in 1981 when IBM and the World Bank entered into a swap agreement.[2] Today, swaps are among the most heavily traded financial contracts in the world: the total amount of interest rate and currency swaps outstanding was more than $347 trillion in 2010, according to the Bank for International Settlements (BIS).



The global financial markets will change: there will be different exchanges for equities, bonds, derivatives and futures, each with a local flavour but linked together worldwide. For example, Japan, London, the US, China and Singapore would each have their own exchanges dealing with different instruments, with the option to raise funds or trade in different markets. This differs from the structure of present markets, which may not be efficient in catering to the needs of diverse markets. The future is for each exchange to control its own market but co-operate to access others; nobody likes to be under an obligation where the local laws do not apply, as it is too complicated to establish jurisdiction covering every market. If there are disputes, arbitration will be a big problem; if the UN were asked to solve all these problems at the highest courts, there would be no time or resources to do anything else. It is the same for maritime law: if you can group together in a region, it may simplify things. Markets that lack volume will combine into a central financial market for their region. In the end, those who enjoy economies of scale and can cater to demand will enjoy brisk business.

– Contributed by Oogle.

You need to set realistic goals and a priority list based on your budget, and solve the most pressing problems that affect your bottom line first (Strategy).

Published: November 28, 2012

CHICAGO — It was supposed to be a moment for celebration: United Airlines marking the delivery of its second Boeing 787 Dreamliner with a flight from Seattle to Chicago earlier this month for a select group of employees, while senior officers, including Jeffery A. Smisek, United’s hard-charging chief executive, served Champagne and took lunch orders.

But before the flight took off that morning, a glitch in one of the airline’s computer systems delayed 250 flights around the world for two hours.

So it goes at United these days. The world’s biggest airline, created after United merged with Continental Airlines in 2010, promised an unparalleled global network, with eight major hubs and 5,500 daily flights serving nearly 400 destinations. As an added benefit, the new airline would be led by Mr. Smisek of Continental, which was known for its attention to customer service.

But two years on, United still grapples with myriad problems in integrating the two airlines. The result has been hobbled operations, angry passengers and soured relations with employees.

The list of United’s troubles this year has been long. Its reservation system failed twice, shutting its Web site, disabling airport kiosks and stranding passengers as flights were delayed or canceled. The day of the 787 flight, another system, which records the aircraft’s weight once passengers and bags are loaded, shut down because of a programming error.

United has the worst operational record among the nation’s top 15 airlines. Its on-time arrival rate in the 12 months through September was just 77.5 percent — six percentage points below the industry average and 10 percentage points lower than Delta Air Lines. It had the highest rate of regularly delayed flights this summer, and generated more customer complaints than all other airlines combined in July, according to the Transportation Department.

The airline even angered the mayor of Houston, Continental’s longtime home and still the carrier’s biggest hub, when it unsuccessfully sought to block Southwest Airlines’ bid to bring international flights to the city’s smaller airport, Hobby. 

The United-Continental merger is weighing on the company’s finances. It took a $60 million charge in the third quarter for merger-related expenses, including repainting planes. It also took a $454 million charge to cover a future cash payment to pilots under a tentative deal reached in August.

While most large airlines reported profits this year, United has lost $103 million in the first three quarters of 2012, with revenue up just 1 percent to $28.5 billion. Its shares are up 7 percent this year compared with a 12 percent gain for the Standard & Poor’s 500-stock index and a 24 percent gain for Delta.

“United remains at a challenging point,” analysts from Barclays wrote last month, and they forecast that the carrier would not begin to see the benefits of its merger until late in 2013 and into 2014. Still, while airlines initially struggle, mergers increase revenue eventually, as the example of Delta’s acquisition of Northwest Airlines demonstrated two years ago.

Mr. Smisek, taking a break from serving coffee halfway through the maiden 787 flight, acknowledged that things were not going as fast as expected, particularly given the aggressive targets he set two years ago. Back then, Mr. Smisek said the merger would be wrapped up in 12 to 18 months. He has since learned to be patient, he said.

“It is still a work in progress,” he said. “The integration of two airlines takes years. It’s very complex. If you look at where we were two years ago, we’ve come a long way.”

Admittedly, the process is complicated. Airline mergers mean combining different technologies, often old computer systems, as well as thousands of procedures used by pilots and flight dispatchers, gate agents, flight attendants and ground crew.

Setbacks are common. Like United, US Airways experienced a breakdown in its booking technology after its combination with America West in 2005. Delta’s on-time performance fell sharply in the year after its purchase of Northwest.

But today, Delta is a leader among big airlines in on-time performance. US Airways had a record third-quarter profit even though it still lacks common work rules for its pilots seven years after its merger.

United has completed many of its merger tasks, particularly as far as passengers are concerned. It has received its single operating certificate from the Federal Aviation Administration, allowing it to run a combined fleet. Despite all the problems this summer, it claims to have finally merged the reservation and technology systems.

Mr. Smisek said passengers would see the benefits of the combination by next year as United introduces new features on its planes, including satellite-based Wi-Fi, flatbed seats in business class and bigger overhead bins on its fleet of Airbus narrow-body planes.

One of the remaining sticking points, however, is getting employees of the two merged carriers to agree to a single contract. Pilot unions signed a tentative agreement with the company in August, after months of bitter negotiations. Talks are continuing for agreements with unions representing flight attendants and mechanics.

“There always seems to be some bump in the road,” said Ray Neidl, a senior aerospace and airline analyst with the Maxim Group. He said much of the merger’s benefits would kick in after the airline got its collective agreements with its work force. “Once they get these challenges out of the way, United will be a powerhouse.”

For many analysts, United’s real challenge lies in combining different work groups with different cultures, values and ways of doing things.

That is particularly true for United, which had a history of sour labor relations, and Continental, long considered one of the nation’s best-run airlines.

“You know, the cultural change takes time,” Mr. Smisek said. “And people resist change. People are sort of set in their ways.”

He added the airline was now intent on providing better operational performance and consistently good customer service. “And there are people who don’t like that,” he said. “I understand that. What I want is those people to either change or leave.”

There are few lasting advantages in the airline business. Airlines can easily match what rivals are doing, whether by lowering fares, buying new planes or installing new features on their aircraft. But United insists that its network remains its most resilient strength and will help it attract more passengers.

The carrier’s dominant market share at Newark Liberty International Airport, for instance, appears unassailable and provides a formidable gateway to the New York and East Coast markets. United’s Houston hub is a major jumping-off point to Latin America. And United is the biggest carrier in San Francisco, giving it an advantage in the Pacific, where it is the biggest American carrier.

United is counting on new planes to make a difference in coming years. It has made a big bet by ordering 270 planes over the next decade, including 50 Boeing 787s and 25 Airbus A350s.

The 787’s long-range ability and relatively smaller size will allow United to add new direct service between cities that did not have enough traffic to justify bigger planes like the Boeing 777. Its 787s will fly between Houston and Lagos, Nigeria, starting in January, followed by service from Denver to Tokyo’s Narita airport in March, and from Los Angeles to Tokyo and Shanghai.

United is betting that passengers will be drawn by those new services as well as by the 787’s carbon-fiber technology, which allows higher levels of humidity and oxygen in the cabin and can, Boeing claims, help reduce jet lag-related fatigue.

The airline moved its headquarters to the Willis Tower in Chicago last year. In June, it set up a new Network Operations Center, occupying a full floor in the tower in a vast open space previously used as a trading floor. From here, managers run daily operations, overseeing flight schedules, crew availability, weather forecasts and any delays throughout the system.

After the summer’s mishaps and poor performance, United has improved its on-time record. In particular, it said, its on-time arrival rate this month was 85 percent.

“We think we’re in a good spot given where we are in the merger,” said Peter D. McDonald, United’s chief operations officer.

Still, perceptions may be tough to fight, particularly online and in frequent-flier forums, where criticism of United’s service and performance has been particularly bitter. One critic, who goes by @FakeUnitedJeff, parodies Mr. Smisek on Twitter. One post last month read: “It’s raining in Newark. I wish we’d bought waterproof aircraft. Cancel, cancel, cancel.”

Why you need customised hardware and software for this industry

By Sarah McBride

SAN FRANCISCO (Reuters) – General Electric (NYS:GE), famous for branded hardware from lightbulbs to turbines, is pushing hard into software, chief executive Jeffrey Immelt said at the Minds and Machines conference on Thursday.

“In an industrial company, avoid software at your own peril,” Immelt said in a discussion with Marc Andreessen, the famously pro-software venture capitalist, and Chris Anderson, editor of Wired magazine.

Software and analytics have become increasingly important for many of GE’s clients, Immelt said. A group of GE’s radiology customers, for example, told him earlier this week that if given $10 million to invest, they would put it into analytical tools like post-processing algorithms; ten years ago, the answer would have been brain-imaging machinery, he said.

The same thinking applies to GE customers in areas such as energy. “The grid has tremendous opportunities for application of both software and analytics,” he said.

In some areas, “a software company could disintermediate GE someday,” Immelt said. “And we’re better off being paranoid about that.”

General Electric last year generated some $42 billion in revenue from services – a higher margin business that Immelt sees as key to growing the company’s profit in a weak global economy. The new push into software is a bid to expand its service offering while also defending its position in big industrial markets such as electric turbines, jet engines and railroad locomotives.

On Wednesday, the company said it was planning a new push into services.

Andreessen suggested a good way for old-line companies to gain an edge in software is to invest in leading software start-ups.

Just as Andreessen admitted that corporate venture capital had a bad reputation for investing in young companies at the height of bubbles, Immelt jumped in. “We’re the dumb money,” he quipped, as the audience laughed.

Andreessen stood his ground. “Engaging in the ecosystem is a big deal,” he said. “Showing up with a check is an excuse to have the conversation.”

His love of software aside, Immelt defended GE’s core businesses.

“In many ways, our products are more resilient than things like computers,” he said, citing the huge capital costs required. “The moat around this baby is wide,” he added, waving at a large jet engine on the side of the stage.

Andreessen too had some kind words for hardware, citing Lytro, the camera company his firm has backed. “If you had told me three years ago we would fund a camera company, I would say no way,” he said.

Andreessen Horowitz, his venture firm, has also backed a stealth robotics company and Jawbone, the maker of headsets and a health-monitoring device, he said.

Both men finished by getting back to their core strengths. Andreessen said 2012 would be remembered as the year of software as a service, in the style of companies such as Workday (WDAY.N) and Salesforce (NYS:CRM).

Looking forward, Immelt said 2013 would be the year of manufacturing, when companies see it as a competitive advantage and society sees it as a creator of middle class jobs.

“It’s a time when manufacturing has become sexy again,” he said. (Reporting By Sarah McBride and Scott Malone; Editing by Tim Dobbyn)


The demands of this industry require you to fully customise both your hardware and software to meet the technical specifications. Otherwise you will face problems like those in high-frequency trading and the breakdowns of MRT trains in Singapore: off-the-shelf computers cannot handle the demands of industrial use, and if you save money this way, be prepared for problems and breakdowns. I study the total environment, from the CPU to hyper-threading, system performance and cache memory, together with the software that handles external applications and interfaces for load balancing. If you do not study the total environment, how can you detect problems before they happen? I break everything down and work out the requirements to create a technical specification list; without doing so, you are asking for trouble. Millions could be spent on engineering and industrial equipment, yet everything becomes worthless with constant breakdowns, and you will never get back your ROI. Mission-critical applications cannot use standardised computers.
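The idea of detecting problems before they happen can be sketched in code. The watchdog below is hypothetical, not any real system’s specification: it keeps a rolling window of one performance metric (latency here, but it could equally be cache miss rate or queue depth) and flags the component when the recent average drifts well above a slowly adapting baseline, so degradation is caught before an outright breakdown. The window size and alert ratio are illustrative tuning assumptions.

```python
from collections import deque

class HealthMonitor:
    """Flag a metric whose recent average drifts far above its baseline.

    The alert fires when the rolling average exceeds the adaptive
    baseline by the given ratio; window and ratio are assumed defaults.
    """

    def __init__(self, window=10, ratio=1.5):
        self.samples = deque(maxlen=window)
        self.baseline = None
        self.ratio = ratio

    def record(self, value):
        """Record one sample; return True if degradation is detected."""
        self.samples.append(value)
        avg = sum(self.samples) / len(self.samples)
        if self.baseline is None:
            self.baseline = avg          # first sample seeds the baseline
            return False
        # Adapt slowly so gradual, legitimate change is tolerated.
        self.baseline = 0.95 * self.baseline + 0.05 * avg
        return avg > self.baseline * self.ratio

monitor = HealthMonitor()
for latency_ms in [20, 21, 19, 22, 20, 60, 65, 70, 75, 80]:
    if monitor.record(latency_ms):
        print(f"degradation detected at {latency_ms} ms")
```

In a real mission-critical deployment, monitoring like this would sit alongside, not replace, the full technical specification work described above.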

– Contributed by Oogle.