Not really a VxRack test
This is more a ScaleIO test. Ideally you'd run the VMs on the compute nodes in the VxRack.
I remember losing a document my mum was writing.
When you started a document it would write a header to the disk.
If the disk was almost full and you tried to save, the save would fail.
You then lost the document when you inserted a new disk.
Fortunately you could drop to the OS and delete some stuff, but I managed to exit the word processor completely, hence losing the document.
With more and more work going offshore this problem will only get worse, especially with some of the job titles junior staff abroad seem to get.
Director of Technology? After two years on the job? Totally inflated egos.
Working as a team gets harder and harder; eventually everybody becomes a lone wolf because there is no team.
Also, with the offshoring you no longer get the junior staff you can train up locally, teaching them those tips, tricks and useful shortcuts.
Personally, sharing information means I can say "no" when the workload gets too high, in the full knowledge that somebody else in the team can pick up the work.
A few years ago I was doing an upgrade to an ERP system.
All seemed to be working fine. About two weeks before go-live, UAT started, and all of a sudden there was a massive uproar: a key piece of functionality had completely gone AWOL, along with all its associated tables and code.
Much head scratching on our side as there were no errors in the logs and everything had gone really well up to that point.
So we cross-referenced the production system, still on the older release, and traced the functionality, only to find that none of the tables/programs were marked as vendor-specific code or even customer-specific code.
After much digging we found out that the vendor had demo'd some beta code on their system, but it had never been purchased.
Some bright spark (a contractor) then decided to export the tables/programs from the demo system and import it all into the normal DEV/QA/Prod landscape, passing it off as his own code; he had managed to "save" the company a stack of cash for the functionality it needed.
Alas, as the code was not signed correctly, it got deleted during the upgrade.
Much embarrassment later, they had to hand over vast amounts of cash to the vendor (five years of backdated licences), and the go-live date was missed due to having to import the official code base, create routines to export and reload the data, and run another round of regression testing.
Unlike a processor, where size does matter, is it really that important for storage, where the form factor is largely irrelevant?
Obviously it's cheaper to make chips at a smaller process size, but the cost of the fabs is quite considerable.
Why not have, say, 5 x 500GB flash chips in a single 2.5" enclosure with a RAID 5 controller built in?
That would give you a 2TB disk with some level of redundancy.
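As a quick back-of-envelope check (a minimal sketch in Python, using only the figures above):

    # RAID 5 capacity arithmetic for the hypothetical 2.5" enclosure above:
    # one chip's worth of capacity goes to parity, the rest is usable.
    chips = 5
    chip_gb = 500
    usable_gb = (chips - 1) * chip_gb  # RAID 5 usable = (n - 1) * chip size
    print(f"{chips} x {chip_gb}GB in RAID 5 -> {usable_gb}GB usable")
    # -> 5 x 500GB in RAID 5 -> 2000GB usable, i.e. the 2TB disk above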
The FUD is that HANA can only work with clustered memory systems. Not true: it will scale to the size of the largest box given to it.
All well and good scaling up to 64TB, but do you know of any SAP customers whose ERP systems are currently 640TB in size (assuming a 10:1 compression ratio)?
Even the largest systems hover around the 20-30TB mark, so they will compress down nicely to 2-3TB.
So they will work on a single node.
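To put numbers on that (a rough sketch; the 64TB box and 10:1 ratio are the figures quoted above, the 30TB system size is illustrative):

    # Sizing argument: a disk-based ERP database compressed roughly 10:1
    # fits comfortably inside one large-memory node.
    ram_tb = 64                                  # largest single-node box cited
    compression = 10                             # assumed ~10:1 compression
    max_source_db_tb = ram_tb * compression      # 640TB of uncompressed data
    typical_db_tb = 30                           # even the largest ERP systems
    in_memory_tb = typical_db_tb / compression   # ~3TB once compressed
    print(max_source_db_tb, in_memory_tb)        # 640 3.0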
The M series may be the McLaren of in-memory systems, but 99% of customers only need a fast bus, or maybe even a fleet of buses, and they will still come in cheaper than a proprietary closed architecture, with Oracle's lock-in and appalling customer service.
Also, if you want to run the SAP analytics on your Oracle box, you need layers and layers of software to extract and transform the data into the Exalytics engine, adding more and more complexity.
@MadMike
FUD
SAP is about to release a version of HANA on IBM Power, so the P795 with 16TB and above.
They also run on very large single-node x86 systems (e.g. SGI with 64TB of RAM).
It's also about the cost/performance benefit. For the performance a customer requires, two fully loaded blades may well be totally adequate, rather than going for a costly single node or a proprietary and costly Oracle solution.
Being x86 based, you can easily move to cloud-based solutions, changing providers with ease (HANA also works under VMware, so it's very portable).
Also, you have the problem that the new SAP applications will be written for HANA only, with some SQL middleware sat in the middle parsing SQL statements to then read from memory.
This is not about making an existing application run faster (with all its bad SQL statements, duplicated tables, etc.); this is about designing applications from the ground up with new in-memory data models in mind.
It is also about abstraction. Rather than loads of code with duplicate SQL statements, it is designed to be based on objects, e.g. purchase orders and finance documents, making development time quicker.
HANA is not just a database: it contains analysis libraries, geospatial functionality, planning/graphing, search/text analysis, landscape transformation and data services, all built in.
These are all objects/APIs that can be called without having to access a table directly.
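Purely as an illustration of that abstraction point (hypothetical names, nothing like the real SAP/HANA APIs, with SQLite standing in for the database):

    import sqlite3

    # The idea: callers work with a business object such as a purchase order;
    # the table access lives in one place instead of being copy-pasted into
    # every report and transaction that needs it.
    class PurchaseOrder:
        def __init__(self, conn, po_number):
            self.conn = conn
            self.po_number = po_number

        def total_value(self):
            # The only place the underlying table is touched.
            row = self.conn.execute(
                "SELECT SUM(net_value) FROM po_items WHERE po_number = ?",
                (self.po_number,)).fetchone()
            return row[0] or 0

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE po_items (po_number TEXT, net_value REAL)")
    conn.executemany("INSERT INTO po_items VALUES (?, ?)",
                     [("4500001", 120.0), ("4500001", 80.0)])
    print(PurchaseOrder(conn, "4500001").total_value())  # 200.0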
@SecretSonOfHG
It's always hard to undertake such a project, but you have to look at your competition.
Are they able to close a period in a matter of minutes? Are they able to do real-time replenishment/forecasting?
Most adoption of HANA at the moment (I work for an SAP hosting/consultancy) is around reporting, but companies are seriously thinking about adopting it for the back end. Memory prices will come down (although SAP does need to rethink its licence model).
But all new SAP functionality and real-time business processes will be HANA only.
Also, a lot of customers want to move away from Oracle as the back end, and at the moment the migration cost will be the same regardless of the target DB.
HANA is not restricted to the memory footprint of a single machine; it can scale out by adding new blades.
You can also relegate less frequently used data to near-line storage if you want.
There are different routes to HANA, even with your existing ERP system:
a) A sidecar, where you move specific objects to HANA (your back end still being a traditional DB) and the SAP kernel reads from HANA and writes to both DBs.
b) A normal DB migration (SAP has had those tools for years).
c) Re-implementation of specific functions (e.g. finance) on a new system (the S/4HANA platform).
Once your "old" ERP system is on HANA, you can either implement new functionality on the new platform, but you can also implement it on the same platform and move it later.
Every new platform/technology is going to be disruptive at some point in time. But your traditional ERP system will still be supported until 2025, so there's plenty of time to migrate.
What MS does need is a truly high-end, shiny, bling phone.
A nice metal case, 5.5/6-inch screen, good camera, lots of storage (plus an SD slot), with the latest and greatest processor and lots of memory.
Then have some "can't quite afford the flagship phone" models.
The OS isn't that bad. Having gone from iOS to Android, and having used my son's Windows phone, I actually prefer the Windows tiles over Android, which to me does not have a consistent look and feel; it's a bit bitty. It's still not as polished and consistent as iOS, but Apple just didn't have the phone I wanted at the time, so I went for the HTC One.
It's a pity that AMD has taken its eye off the ball in terms of innovation on the x86 side of things, especially in the server space.
No new Opterons with greater core density, no revamped FX processors with the enhancements they made to the APUs in terms of per-core performance.
Yes, they realise that they went down the wrong path with the modular cores, but they could at least have tried to keep up.
I'm surprised revenues are still so low, though, given the Xbox and PS4 deals.
AMD was such an innovative company, starting with the various extensions (3DNow!), on-chip memory controllers, multicore, and the x86-64 extensions.
Personally, I think they should drop the A-series processors for the laptop/low-end PC market, take the FX line, add GPU functionality and bring the price down; at least then they would be able to compete with the i3 and low-end i5 range.
Certainly a lot of my customers have lost a lot of trust in Oracle, especially after the HP/Itanium debacle.
The Oracle DB is also no longer the most capable DB on the market, and the pricing is far too high.
The hardware pricing is also no longer competitive, with most customers moving to AIX if they still want proprietary Unix.
I was at a technical conference in Berlin last week. Oracle had the biggest stand with the most salespeople, yet the fewest visitors.
I don't think we are there yet, but I can imagine a future where your phone/tablet is your compute device.
The operating system will have different personas (this is where Windows Phone/RT may have a head start).
When undocked, the OS (and apps) take on a mobile persona, all based around touch and non-multitasking workloads.
You then dock at your office/home and a desktop persona comes into play, allowing multiple monitors, multitasking, and the use of a mouse and keyboard.
If you need real grunt, a server is only a network cable or a cloud away.
You would think the underlying hardware would not matter. But it does.
For highly critical systems, companies will choose a commercial operating system with an absolutely proven track record, scalability and performance.
So that leaves the usual three: AIX, HP-UX and Solaris (we'll ignore Windows for the time being).
None of my customers are buying HP-UX any more, and Itanium is truly dead in my opinion.
Oracle has lost a lot of trust in the marketplace (in part due to the HP-UX debacle), and SPARC does not scale as well in single-threaded performance as x86 or POWER. Again, customers who used to be loyal to Sun are very much looking at IBM for mission-critical systems.
So why not Linux on large x86 servers? x86 does not scale well as socket counts go up, and there is no longer a single neck to choke when things go wrong. Yes, people are buying Linux on x86 in droves for application servers and non-critical systems, and doing it very cheaply, thank you.
Back to Windows: they do not own the hardware stack, so again there is no single supplier to string up.
Well, you can now get Elise S1s for around the six-to-eight grand mark these days.
Not quite as raw as a 7 but still puts a smile on your face.
I'm planning on building a 7 with my son for his first car, based around a donor car with a really low-powered engine to keep his insurance as low as possible. He'll learn how to maintain it in the process as well.
His arrogance may have helped build Oracle into what it is today; unfortunately he believes his own FUD so much that he is not looking reality in the face. If you are looking at cloud, you are generally not looking at a homogeneous stack.
If a service is running in the cloud, the end user/customer does not give a flying f* which database is running in the background. All they want is:
a) A service that provides the functionality they need
b) A service that meets the SLA
c) The ability to chop and change quickly between rival services
Squeezer, that is mainly what I was talking about. In general, people who buy non-pop music still mostly go the CD route.
It's the compression that gets my goat.
No studio I've been into (including an invite around Abbey Road) uses directional snake-oil cables. Yes, they use low-capacitance, low-resistance cables with very good connectors (for reliability reasons) in the pure analogue audio path.
It is a pity that in the digital domain the cut-off frequencies are 20Hz and 20kHz; I'm a great believer that there are certain frequencies we feel rather than hear which contribute to the overall mood of the music.
I've never heard a classical recording that can quite capture a real concert, or a recording that can capture the mood of a live rock performance with 18" bass bins, where you can feel the punch of the bass drum against your chest, no matter how loud and high-quality the recording and the equipment.
Your average studio generally has a minimum of two sets of mixing monitors: a high-end, flat-frequency-response monitor and a shitty set of speakers typical of a consumer unit.
On the first pass, when mixing levels and adding compression, effects, etc., the high-end monitors are used. The producer's/engineer's ears are trained to hear individual instruments, so they concentrate on ensuring that they sound good (not necessarily an exact reproduction of the instrument per se).
Once the mix is completed, it is mastered down to a stereo image, first using the high-end monitors to get the EQ/compression just right. This is the mix you really want to hear but never really get the chance to.
Then it is passed to the shitty monitors, where additional compression/EQ is applied so it sounds reasonable. The difference between these two mixes is night and day. Listen with your eyes closed and you get fantastic stereo separation, cymbals ring beautifully, you can hear (if you are in the know) what kind of guitar amp is being used, and you can feel the real force of a good singer's voice.
So what we get delivered on a CD is actually a pretty crap representation of the recording anyway. All the dynamics will have been compressed away and "loudness" will have been added, the last Metallica album being a prime example (the Guitar Hero mix was actually better).
Now, your average listener does not know the difference between a Marshall amp and a Fender amp, or a Shure microphone and an Audio-Technica microphone, so generally it does not really matter.
Also, hi-fi gear is not designed (despite what the manufacturers say) to give a perfectly flat frequency response (studio gear is); it is designed to sound "nice" with whatever material is thrown at it.
Listen to the crap mix through quality studio monitors and it will sound horrid, with a nasty top end and far too much bass.
I used to have a 4S, but when it came to renewal Apple did not have the phone I wanted (i.e. a bigger screen), so I did what many did: I went with the HTC One.
However, I've never really got on with Android; I find the whole interface and usability just a bit messy.
I do, however, have another year to go on my contract. I'll install iOS 8 on my iPad and see how it goes. 5.5" is too big for my liking, though; it's a pity the 4.7 does not have the optical image stabilisation.
Well, let's hope the iPod touch or the new iPods get a 128GB option soon.
My old 120GB Classic has served me very well. It's great on the train, where you can never get a 3G/4G signal.
The battery lasts the entire return journey, and it doesn't waste the battery on my HTC One (which struggles to keep its charge for the day as it is).
It plugs directly into my car, again not reliant on 3G/4G, and I have my entire CD collection on it.
Yes, it also plugs into the docking station at home. Until we get truly universal and reliable internet coverage (also on the tube/train) for free, there will always be a place for a personal player.
Even my 15-year-old daughter separates music and phone, even though her phone has far more storage.
Maybe it's time to move processing back into the source. Have a separate device for HDMI switching (and basic audio pass-through) and an amp with 7/10/12 channels and no processing.
You then have an optical connection from source to amp, with a connection to the switcher to determine which source is in use.
Unfortunately, the SIs/business analysts are just not coming up with the new ideas that real-time systems can bring.
Just accelerating the compute time of existing business processes is not enough; we need new ideas, new ways of tackling business, that take true advantage of real-time processing.
Until something else that is truly OS independent (i.e. looks and feels exactly the same regardless of OS) makes MS Office redundant, Windows will stay.
Why? Cross-company compatibility (yes, I know there are problems between versions). I pretty much know I can go to company X, use their laptops/desktops, and exchange documents with my laptop without any major issues.
I remember many years ago having some Sun consultants on site who had to use Solaris on their laptops; we had no end of problems exchanging documentation, presentations, etc.
I've tried LibreOffice and I find it a little clunky; like a lot of open-source software it lacks shine and usability. Why?
Open source just cannot afford to employ armies of ergonomics experts and armies of artists to produce fancy, colourful icons.
It will take Apple finally letting go of OS X to give MS a run for its money, but they will have to make sure that Pages is 100% compatible with Office.
I've had nothing but scalability problems with the T range of processors.
Whilst they look good on paper with loads of threads, we have found that for heavily loaded transactional systems they perform like a dog.
Sure, they are great when you have lots of parallel processes that require very little CPU power, but if you fire 256 reasonably heavy threads at the thing, it just stalls, giving very slow performance per thread and unhappy users.
With a less heavily threaded CPU (i.e. regular SPARC), we've found it far more efficient for the software thread to wait until a faster core becomes available; users are happy, as their transactions run in a decent time.
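A toy model of the effect, with entirely made-up numbers, just to show why queueing for a fast core can beat running immediately on a slow thread:

    # 256 heavy transactions, 100 units of CPU work each (invented figures).
    jobs, work = 256, 100

    # "T-style": all 256 threads runnable at once, but each hardware thread
    # only gets a sliver of a shared pipeline, so every transaction crawls.
    per_thread_speed = 0.5                 # work units/ms with all threads busy
    t_latency = work / per_thread_speed    # 200 ms for every transaction

    # "Fast-core style": 32 strong cores; jobs queue and run in waves.
    cores, core_speed = 32, 5.0
    run_time = work / core_speed           # 20 ms once a job is on a core
    waves = jobs // cores                  # 8 waves of 32 jobs
    avg_latency = run_time * (waves + 1) / 2   # ~90 ms average incl. queueing
    print(t_latency, avg_latency)          # 200.0 90.0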
MS have gone about this all wrong.
Desktop mode should have stayed the same (aka Win 7), with Metro as an optional shell.
And the marketing: they should have taken that scene from Avatar where they are viewing something on a big screen and swipe the image across to a mobile device.
Now, if a Surface automatically switched to desktop mode when docked to a decent monitor and switched back to the mobile view when undocked, it could be a great device; the mistake was going ARM, I guess.
I built a new PC six months ago. I'm a long-time Windows 7 user.
I initially flirted with OS X, but I could not get the thing perfectly stable, half the free apps I use were not available, and I couldn't use my paid-for apps. Also, the auto-indexing would kill the SATA channels when indexing the SSD, causing the system to freeze until it finished (also an issue with Windows 8/8.1).
Then I flirted with Linux (Mint): too many config options in too many different places, I just find the GUI an unpleasant experience, and then there were the endless command-line commands required to get certain apps working. And I couldn't use my paid-for Windows apps.
I could have gone back to Windows 7, but I had a copy of Windows 8 lying around, so I installed it and upgraded to 8.1.
I use the system mainly in desktop mode (icons on the screen and in the taskbar), with two monitors, and to be honest my workflow has not changed a bit compared to Windows 7. If I do require apps that are not on the desktop, I don't really care whether I find them on a full-screen Start menu or the little one as in Windows 7.
However, I did do a quick install of Windows 7 to see if there were any performance differences: Windows 8.1 performs between 5% and 10% better than 7.
Get a small HP MicroServer with plenty of disk; you're looking at between 1GB and 5GB per movie depending on the encoding (see the rough sizing sketch below).
Install Plex.
Chromecast now has a Plex client (as do iThings, Android and Windows).
It's a great piece of software, which will download all the DVD covers and other metadata as well.
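For a rough idea of the disk you'd need (illustrative numbers; only the 1-5GB-per-movie range comes from above, the collection size is hypothetical):

    # Disk sizing for the movie library, at 1-5GB per movie.
    movies = 400                      # hypothetical collection size
    low_tb = movies * 1 / 1000        # best-case encoding
    high_tb = movies * 5 / 1000       # worst-case encoding
    print(f"{movies} movies -> {low_tb:.1f} to {high_tb:.1f}TB")
    # -> 400 movies -> 0.4 to 2.0TB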
Well, according to my daughter anyway.
Now, an iWatch will have to have a tiny battery, but it will probably draw a lot of power.
Now, I like a watch to tell the time without me having to take it off my wrist to recharge it (I'm now on year five with my wristwatch without a battery change).
Likewise my headphones: I don't want to have to recharge them (I know I'll keep forgetting).
I'm used to recharging my phone; it sits next to me at work plugged into a USB port, is about the right size for surfing, and has a lot of storage (but not enough to get rid of my iPod Classic).
The only reasons I can think of for an iWatch are FaceTime, a quick glance to see who's calling, and maybe the odd fitness app.
For anything else, something like Google Glass is more appropriate (augmented reality).
In the past users got their training from work and did things the IT way.
Now training budgets are slashed, but users have more computer access outside of work than they ever did, thanks to mobile, so they are now self-trained and want to do things the Google/Apple way, which is designed to be easy and hides the complexity.
Give a user a Mozilla client nowadays and all they see is a jumble of subdirectories, and they get turned off.
Easy for us IT guys to understand, but not the general user population.
In my opinion it was nothing to do with MS; they just happened to have an operating system running on IBM x86 machines. In those days a business machine was text-based and users were used to keyboard shortcuts. And IBM was king: they sold you a mainframe/mini/green screen, so you went back to them for a PC, thus creating a business standard. Unfortunately, like a lot of standards that make it, it was not the best, but it did the job.
The likes of the Amiga/Atari ST/Archimedes were considered, and marketed, as toys/games machines, and they were custom architectures.
The likes of Lotus and WordPerfect did not write their tools for these machines; they wrote them for an IBM-compatible machine.
The IBM PC came in a form factor that meant you could add in additional cards, etc.
IBM then allowed third parties to make compatible machines for the home market.
In essence MS were lucky.
What if the likes of Acorn/Commodore/Atari had courted Lotus/WordPerfect, got them to create file-compatible versions of their apps, and bundled an IBM-compatible floppy drive and filesystem in their machines?
Apple should use this form factor to sit between the iMac and one of these.
The Mac mini just isn't powerful enough, and whilst the iMac does the job, some of us don't need/want an all-in-one.
Give me one of these with an i7, a CrossFired gaming GPU, an SD slot and a 256GB SSD at around £1.5k, and it would sell like hot cakes.