Monday, August 31, 2015

VMworld2015 – Realize the Virtual Possibilities

The always-insightful Phil de la Motte, Director of Business Development, lets us in on the F5 highlights at #VMworld, along with how to take advantage of the labs and demos that are also available online - for instance, you can virtually participate in the Hands-On Labs (HOL). He also discusses the latest vRealize integration and the business benefits of a combined F5 and VMware solution: deploying applications faster, securing them more easily and automating as much as possible. Phil includes some good things to think about when deploying virtualized environments.

ps

Related:

Connect with Peter: Connect with F5:

Sunday, August 30, 2015

VMworld2015 – Find F5

I show you how to find F5 Booth 1513 at #VMworld. In my 5th year covering VMworld, I give you a behind-the-scenes pre-show view of Moscone Center, along with a sneak peek at some of the cool goodies F5 is giving away this week.

ps


Thursday, August 27, 2015

VMworld2015 – The Preview Video

I give a preview of VMworld 2015 happening August 30 – Sept 3 in San Francisco. Along with the expected 23,000 attendees, F5 will be present in Booth 1513 to help you realize all the virtual possibilities of the Software Defined Data Center. ‘Ready for Any’ is the #VMworld theme for 2015 and F5 is ready for any of your questions about virtualization and cloud technologies.

ps


Thursday, August 20, 2015

Our Five Senses on Sensors

Aristotle (384 - 322 BC) is credited as the first person to classify our five sense organs: sight, smell, taste, touch, and hearing. Immanuel Kant, the famous 18th-century philosopher, said that our knowledge of the outside world depends on our modes of perception. Our highly developed eyes, ears, nose, tongue and skin provide the sensing equipment necessary to send that information to the brain. When one of those sensors doesn't work properly, as with the blind or deaf, the other four senses are often heightened beyond normal operation to make up for the missing information. Daniel Kish, for example, uses echolocation like a bat, 'seeing' the imprint of sound waves as they bounce back. Pretty cool, eh?

Today, we're building gadgets that work in conjunction with, or completely take over, the tasks of the eyes, ears, nose, tongue and hands. Things that were always part of our body are being replaced with micro-chipped things that act like, attach to - or better yet - integrate with our body.

Sight: There are, of course, security cameras that help us watch our homes while we're away, and most of us have heard of Google Glass, but BMW's Mini division is now prototyping eyeglasses that combine the wearable with the connected car. The glasses communicate with the car via WiFi and offer a heads-up display like no other. While you can still see the real world, the glasses overlay speed, navigation, backup cameras and more. You can see just how close you are to the curb from the wheel's point of view, or look at a street sign and have it come to life with additional overlays and info. While most of the data is just telemetry for now, engineers are looking to possibly incorporate driving features within the view. This is where IoT gets interesting: where one device is used to complement another. Swiss engineers have also developed a camera based on the human retina; by understanding the biology of the real thing, they've made a more efficient camera.

Smell: Although there were earlier attempts, in the 1940s and '50s Hans Laube created a system called Smell-O-Vision that emitted odors during a film so the audience could smell what was happening on screen. It was only used once. GE also developed a system in 1953 that it called Smell-O-Rama. Now you can get a smell app on your phone: ChatPerf is a thumb-drive-sized atomizer that plugs into your mobile device so it can be triggered to release specific odors on command. But those are scents going out. Machines that can sniff things in have been around awhile; think of your smoke, carbon-monoxide or radon detectors. Today we have wearable vapor sensors that can smell diabetes. Scientists have figured out how to detect melanoma with a sensor that picks up the odor those skin cancer cells give off. And scientists in Israel, who have already developed a nanotechnology breath analyzer for kidney failure, are working on one that can distinguish the breath of a lung cancer patient from a healthy exhale. Crazy!

Hearing: According to U.K. firm Wifore Consulting, hearable technology alone will be a $5 billion market by 2018, roughly the size of the entire wearables market today. Ears can reveal things like oxygen levels, electrocardiograms, and body temperature. While sound drives the bulk of the technology in this space, those ear buds could soon not only deliver sound but also capture some of your body's information. And they are small and discreet enough to wear everywhere, rather than something you carry like a mobile device. Initial uses trend toward fitness: ear buds that play music but also give you feedback on your workout. There are also smart earrings that monitor heart rate and activity. I've always said there will come a time when we all have IPv6 chips in our ears and we'll just tug the lobe to answer a call. Carol Burnett would be proud.

Touch: Want to give a robot the ability to feel? Done. Researchers have developed a flexible sensor that detects temperature, pressure and humidity simultaneously, a big leap toward imitating the sensing features of human skin. While still in the early stages, future sensors could be embedded into the "electronic skin" of prosthetics, allowing amputees to sense environmental changes. Another is BioTac, a fingertip that can sense force, temperature and vibration, in some cases better than a human finger. And with laser 3D printing, some orthotics can be delivered in hours rather than months.

Taste: Sweet, sour, salty and bitter used to be the domain of the tongue. Soon, electronic 'tongues' could be used for quality control of bottled water. Using chemical sensors, researchers in Texas have demonstrated that an electronic tongue can 'taste' different solutions: the sensors respond to different combinations of the four artificial taste elements with unique color combinations of red, green and blue, enabling the device to analyze several different chemical components simultaneously. I've written about smart chopsticks that can detect oils containing unsanitary levels of contamination, a fork that monitors how many bites you take and a smart cup that counts the amount and calories you drink. This is the Internet of Food.

Wearables make technology personal, and our five senses are what help us navigate life and give us perspective. Who would have thought that an individual's perspective would someday be embedded within coded software?

ps


Friday, August 7, 2015

That's a Wrap from #F5Agility15

Our video partners at cloud-channel.tv put together a nice wrap-up of #F5Agility15 from Washington DC. Special thanks to F5's Manny Rivelo, Ken Salchow, Joe Pruitt and Tony Hynes, along with VMware's Jared Cook. Thanks also to the F5 Studio, and a huge thanks to the many attendees for a wonderful week! And as always, Mahalo to you for watching. Reporting from National Harbor, MD: That's a Wrap!

ps


F5 DevCentral Solves Your BIG-IP Questions

In this lively chat at #F5Agility15, DevCentral members Joe Pruitt and Tony Hynes share a little history of how the community has grown from a single server in 2003 to over 200,000 members today; how iRules, iControl and iCall interact with BIG-IP's programmability features; and how the community helps solve, share and answer some of the challenges of today's hybrid environments. They also highlight the MVP program and some of the new personal customization coming soon.
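For readers who haven't seen BIG-IP programmability in action, iRules are small TCL-based event handlers attached to a virtual server. As a minimal sketch (the path and hostname below are hypothetical, not from the video), a redirect rule looks roughly like this:

```tcl
# Fires for every HTTP request that reaches the virtual server.
when HTTP_REQUEST {
    # Hypothetical example: send old URLs to the new application host.
    if { [HTTP::uri] starts_with "/legacy" } {
        HTTP::redirect "https://www.example.com/new[HTTP::uri]"
    }
}
```

It's event-driven snippets like this, shared and refined on DevCentral, that the community has been building on since 2003.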

ps


Thursday, August 6, 2015

Software Defined Data Center Made Easy with F5 and VMware

Jared Cook, VMware's Lead Strategic Architect, Office of the CTO, visits #F5Agility15 and shares how F5 and VMware solutions can be orchestrated together, enabling customers to spin up applications on demand and provision the F5 software-defined application services those applications need to run successfully, with greater ease and automation than before in the Software Defined Data Center.

ps
