Robert Reich commented today on Bernie Sanders, and in that commentary was this quote from Sanders: “there are people out there, Donald Trump and others, who are attempting to do what demagogues have always done, and that is instead of bringing people together to address and solve the real problems that we face, what they try to do is tap the anger and the frustration that people are feeling and then divide us up. So we have a message to Trump and all the others out there who want to divide us up: No, we’re not going to hate Latinos, we’re not going to hate Muslims, we are going to stand together.”
Generally I don’t comment on politics, as I believe our country is not served by it. We should have folks in charge whose only desire is to run the country in the best interest of all people, keep us safe and, for the most part, stay out of our business, as long as we are not infringing on other Americans’ basic rights and freedoms (not the long list of pet peeves we tend to cater to).
However, I do wish to point out that Sanders is no better than any other politician in approaching the office with a preconceived “wishlist” he philosophically believes in.
Sanders on more than one occasion has sought to single out entire groups to demonize so, in true political fashion, he is as guilty of attempting to “divide us” as he states Trump and others are. The “Donald” has already joined the political club and is no better than Sanders either, lest you think I’m siding with him.
We need management at this point by folks with no financial, philosophical or long term political ambitions. Our problems now are too complicated to be resolved by a single political ideology and require only the most dedicated of problem solvers, not politicians.
I know this is as much a “Utopian” wish on my part as every politically minded person’s thoughts on what their perfect world would look like, but nonetheless I urge folks to look for hard-working and level-headed leaders in the next turn of the election clock so that perhaps our grandchildren will inherit a better world.
As companies shape their IT budgets for the year ahead, there may be some uncertainty about the best way to proceed in the current business climate. According to the 2012 Gartner Executive Programs CIO Survey, IT budgets had been essentially flat over the 10 years prior to 2013, and they grew only roughly 3.8 percent in 2013. Gartner is now reporting that global IT spending will rise for the rest of 2014 and into 2015 as the rest of the economy expands and IT vendors introduce the latest products customers care about.
Given continued pressure to create value and control costs, organizations should consider how to achieve their goals while also meeting baseline operational expectations. By making five ideas central to their planning, organizations can optimize spending for their data center solutions in the year ahead:
1. Review everything: In a 2012 blog post, HP Vice President and General Manager for Software in EMEA Rafael Brugnini noted that many organizations tend to transfer their budget from one year to the next without examining how it is distributed. Instead of keeping line-items the same and simply adjusting spending based on business conditions, companies should consider that new solutions to old problems may have become available and examine what costs it might be possible to outsource.
2. Invest in innovation: Concerns about the economy are restricting investment and change in many organizations. The implications of this pattern are twofold: Businesses may want to make investments now in anticipation of future budget cuts, and companies can gain a competitive advantage by continually improving their data center strategy. Brugnini suggested organizations devote a smaller share of budgets to operational costs and focus on selectively investing in innovation. According to Data Center Journal’s Jeff Clark, anything a company is doing well will be emulated by competitors, so improvement and innovation must be a constant if businesses hope to remain successful.
3. Improve energy efficiency: Energy efficiency may have been talked to death, but, in the face of rising costs and environmental pressure, its importance cannot be overstated, according to Clark. Since data center power demand is only going to increase with service needs, investing in more efficient infrastructure will be essential to cutting operational expenses.
4. Factor in security: A data breach can bring serious business consequences, including fines or sanctions if the data is subject to compliance regulations. While security is a preemptive measure, it is a necessary one that must be included in the budget, Clark noted. This includes planning for physical security, a consideration that may be best addressed by using a multi-tenant data center with staff onsite 24/7.
5. Plan for the unexpected: In business, as in life, there will always be unplanned expenses – equipment failures, natural disasters and more – and companies should manage budgets to handle these costs, Clark noted. One way to control unexpected facility costs is by moving to a wholesale data center, which provides infrastructure at a fixed, manageable expense.
A Memorial Day post of a piece a group of us did in 2008. Featuring Marty Keil, Bill Barrett, Lonnie Wilson, and myself. We all salute our fallen warriors and the families they left behind…both gave up much in order for us to continue the freedom we all enjoy.
Without further ado, The Star Spangled Banner…our way.
Security has been coming up more often recently in light of high-profile data thefts and new threats to your private information. According to many experts, cloud computing increases data center risk, as both cloud and virtualization create new vulnerabilities.
Risks come in both cyber and physical forms, with attacks increasing as more business is conducted via networks, data centers and end-user devices. So the question that comes up in the board room, “how little can we get away with…”, really needs to be, “what can we do to ensure the flexibility to grow and expand yet keep our vital data secure?”
Many are banking on simply going to the cloud. For many industries security concerns make them reluctant to trust their data to a public cloud. Most older data centers, whether they are company owned or a co-location arrangement, were not built to meet the far tougher standards for both security and uptime. They also were built when their purpose was less critical and not designed to meet today’s computing needs.
The first advice you will hear from experts is “don’t cut any corners.” Even with the cloud, the two key pieces of a solid strategy are excellent network connectivity and purpose-built facilities constructed to meet the stringent standards needed for maximum security and uptime.
When we think about physical threats to the data center our minds naturally go to the dramatic natural disaster: earthquakes, tornados, hurricanes, extreme weather, and tidal waves. Data centers should be located as far away from active disaster threats as possible.
But what about physical intrusion? Does it happen? London has experienced more than its share: burglars broke into a Verizon data center, tied up employees and stole equipment. A year earlier another burglary occurred at a Level 3 co-location center. Just a couple of examples out of many. Don’t leave your data center wide open to unwanted visitors or careless staff.
For tech companies seeking to build clouds or deploy application-driven services, choose a facility built with at least these basic guiding principles: not only “best of the best” data center security and strict carrier neutrality (don’t get tied to one carrier), but also systems that provide maximum protection against unplanned power outages, in a location with a very low risk of exposure to disasters.
So the question is, are you worried and is it time to consider new options?
Only you know that answer but I suggest you heed the warning – don’t short change your data center, it’s now more critical to your success than ever.
“Wikibon believes that successful enterprises will be aggressive and early adopters of a mega-data center strategy.”
David Floyer, Co-Founder and CTO, Wikibon
David Floyer only writes about major shifts every couple of years, but as most will tell you, when he does, successful CIOs listen. His latest report, available online here, essentially boils down the next big shift to “start planning and moving to a mega-data center now.”
This is not just a hype thing, it’s a fact.
His report presents a compelling argument for all medium-to-large organizations to move their entire data center infrastructures to IaaS mega-datacenters (AKA Wholesale Data Centers) within the next ten years, if not sooner.
In his piece he describes the key challenges faced in today’s data center topology:
- Stove-piped islands of application infrastructure.
- Stove-piped IT organizations with separate groups of:
- Application development,
- Application maintenance,
- Database administrators (DBAs),
- Server infrastructure operations,
- Storage infrastructure operations,
- Network infrastructure operations.
- Slow connection to public cloud services (which are mainly used for development and edge applications).
- Poor integration between cloud & legacy applications.
- High-cost operations, slow to change.
- Great difficulty optimizing to meet application SLAs.
- Very expensive and difficult to enable real-time analytics.
He goes on to point out that, “Overall IT is perceived by most business executives as inefficient, high cost and very slow to change. Line-of-business executives are exerting pressure to use external IT services to achieve their business goals.” He sums up the problem this way: this sort of pressure, combined with executives seeking primarily SaaS solutions, will end up producing a hodgepodge of systems and data that will continue to be difficult to secure and expensive to use and maintain.
Why Mega-Data Centers?
In the report Floyer goes into greater detail, but the highlights of why he believes in mega-data centers are:
- The cost of IT infrastructure is at least 50% lower.
- The cost of implementing and managing IT infrastructure is up to 80% lower.
- Disaster recovery services.
- Security services.
- Big Data allows the value of enterprise applications to be vastly increased.
His major conclusion: “CIOs and CXOs should aggressively adopt a mega-datacenter strategy that fits their business objectives within their industry. Enterprise IT should exit the business of assembling IT infrastructure to run applications and the business of trouble-shooting ISV software issues. The focus on IT should move to positioning the enterprise to exploit the data avalanche from the Internet and enable real-time analytics.”
Once again, as if CIOs don’t have enough on their plate, here is more proof that network infrastructure, the rising importance of the data center and cloud technologies are pushing the CIO to center stage in the C-Level drama.
Believe it or not, the cost to a business of the average data center outage is going up. At $7,900 a minute, up 41% from 2010’s cost of $5,600 per minute, there are very few industries that can afford not to pay attention to data center uptime. For many of you, the use of technology dependent on 24/7/365 operation of either a cloud or a data center is a matter of business survival. If your tech is down, you and your business are losing money.
For many in IT the rising cost of downtime is a known concern. Their most difficult proposition? Convincing their own management that they should be worried.
When surveyed, CEOs and other C-Level execs often say that their data centers are fine and that downtime preventative measures are an “extra expense”, easy to dismiss.
Yet a study from the Ponemon Institute shows that the cost of data center downtime is now far too expensive for these execs to dismiss so easily anymore.
If nearly eight grand a minute isn’t enough to turn the C-Suite’s hair white, then they need to consider these other findings:
- The average reported facility (power) incident length was 86 minutes, resulting in average cost per incident of approximately $690,200. (In 2010 it was 97 minutes at approximately $505,500.)
- For a total data center IT outage, which had an average recovery time of 119 minutes, average costs were approximately $901,500. (In 2010, it was 134 minutes at about $680,700.)
- For a partial data center outage, which averaged 56 minutes in length, average costs were approximately $350,400. (In 2010, it was 59 minutes at approximately $258,000.)
- The majority of survey respondents reported having experienced an unplanned data center outage in the past 24 months (91 percent). This is a slight decrease from the 95 percent of respondents in the 2010 study who reported unplanned outages.
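As a rough cross-check, using only the figures quoted in the bullets above, dividing each average per-incident cost by its average duration gives an implied per-minute rate:

```python
# Implied per-minute cost for each outage type, computed from the
# Ponemon study's average cost and average duration figures above.
incidents = {
    "facility (power) incident": (690_200, 86),   # (avg cost USD, avg minutes)
    "total IT outage":           (901_500, 119),
    "partial outage":            (350_400, 56),
}

for name, (cost, minutes) in incidents.items():
    rate = cost / minutes
    print(f"{name}: ${rate:,.0f}/minute")
```

The implied rates fall roughly between $6,300 and $8,000 per minute, which is why the headline $7,900-per-minute figure is a reasonable planning shorthand.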
The rise in data center downtime cost stems from the total integration of business operations into IT and data-driven systems. There has also been a huge shift in where prospects and customers do business, moving from real-world interactions to cyber ones. For more companies than ever, if your website goes down or your service is off-line, you’re losing both money and customers.
So how do you address the problem?
Most of the old solutions are just not enough anymore. You can go to the cloud, but even there you are still dependent on data center uptime; now your cloud provider controls whether that is cared for or not.
The solution involves power redundancies that only a Tier III Enhanced (Tier III+) data center or higher can address. Your business can’t afford a loss of $8k a minute. The report indicates that 48% of the outages were unplanned and caused by human error; only a Tier III+ facility can remove that threat. The smaller the data center, the more outages it experiences and the longer its power interruptions last; Tier III+ facilities have a much larger footprint.
Most data center outages occur due to some form of power component failure, such as UPS systems, generators or batteries. Often these are secondary failures that occur when backup systems attempt to kick in during a primary power source interruption. Only a Tier III+ or Tier IV facility is inherently capable of surviving such an occurrence.
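To put tier levels in dollar terms, you can combine the commonly cited Uptime Institute availability targets (the percentages below are the standard published figures, not numbers from this report) with the study’s $7,900-per-minute cost. A back-of-envelope sketch:

```python
# Expected annual downtime, and its cost at the Ponemon study's
# $7,900/minute rate, for the commonly cited tier availability targets.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600
COST_PER_MINUTE = 7_900

tiers = {"Tier I": 0.99671, "Tier II": 0.99741,
         "Tier III": 0.99982, "Tier IV": 0.99995}

for tier, availability in tiers.items():
    downtime = (1 - availability) * MINUTES_PER_YEAR
    print(f"{tier}: {downtime:,.0f} min/yr, about ${downtime * COST_PER_MINUTE:,.0f}/yr at risk")
```

Even the jump from Tier III to Tier IV cuts expected downtime from roughly 95 minutes a year to about 26, which at these per-minute costs pays for a lot of redundancy.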
Read the study. Assess your organization’s risk if your data center goes down and translate that into dollars for making your case for improvements. After such an assessment I think you will see why a move to a Tier III Enhanced, future proof data center may be the best way to prevent the high cost of DC downtime.
There was a time when taking space in a data center owned by someone else meant limitations on who you used for access, and often high costs mixed with low performance. Thankfully a new preferred model is now available in many large markets and has come, or is coming, to many second-tier markets. It’s called “carrier neutrality,” and it sums up as a data center facility that has multiple access carriers available and no investment in your choice of who you use.
This means the facility itself doesn’t offer these services; it only facilitates your ability to connect to them through a section of the facility called the “Meet-Me-Room” (MMR). Since the facility is not “owned” by a single network provider, data center clients can choose one or more carriers, should they seek additional redundancy, that fit their budget and business needs. This also creates a climate where access carriers are motivated to give you the best price and service as they compete for your business. Data centers that feature carrier neutrality keep you from “painting yourself into a corner” by giving you several options, and they provide greater flexibility as your needs grow or change.
The Enterprise benefits of neutrality
- Simple access to multiple providers, motivating them to negotiate the best combination of price and performance to meet your needs.
- The ability to build redundancy and resilience into your access services, for business continuity and disaster recovery purposes.
- The freedom to add or change providers as your business needs evolve with no need to physically move your infrastructure.
- The commercial relationships are completely separate from your client agreement so there is no tie-in to the data center for these services.
- Simple enabling of a multi-sourcing strategy.
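The redundancy benefit above is easy to quantify. Assuming, hypothetically, that each carrier independently delivers 99.9% availability, adding a second carrier through the MMR takes the combined figure to 99.9999%:

```python
# Combined availability of multiple independent carriers reached
# through a carrier-neutral facility (99.9% per carrier is a
# hypothetical figure for illustration).
def combined_availability(*availabilities):
    """Probability at least one path is up, assuming independent failures."""
    p_all_down = 1.0
    for a in availabilities:
        p_all_down *= (1.0 - a)
    return 1.0 - p_all_down

single = 0.999
dual = combined_availability(single, single)
print(f"one carrier: {single:.3%}, two carriers: {dual:.6%}")
```

The independence assumption is the catch: two carriers that share the same conduit into the building fail together, which is one more argument for a properly built Meet-Me-Room with diverse entry paths.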
Ethernet Brings Even More Savings
Ethernet is the most widely deployed technology for high-performance Local Area Network (LAN) environments. Enterprises around the world have invested in Ethernet network connectivity, equipment, processes and training, with other protocol contenders seeing decreased IT mind-share. And while GigE is popular as a LAN solution, the significant change is Ethernet’s role in the Metropolitan Area Network (MAN) market, which has historically used SONET (Synchronous Optical Networking)/SDH (Synchronous Digital Hierarchy).
Legacy metro transport networks are built primarily of time-division multiplexing (TDM) technology. TDM “circuit-switched” services are optimized for delivering voice. The underlying technology of a TDM network is a SONET/SDH ring. A TDM network consists of digital multiplexers, digital access cross-connects (DACs), SONET/SDH add/drop multiplexers (ADMs), and SONET/SDH cross-connects. Plus SONET networks are broken into a multitude of channels as they service everything from traditional telephone service to old T-1s.
Thanks to standardization advances in Quality of Service and Operations, Administration and Maintenance (OAM), Ethernet has now supplanted SONET for metro transport networks, capitalizing on its inherent benefit of being optimized for data transport, where SONET was optimized for voice services.
Ethernet transport uses Ethernet interfaces instead of SONET/SDH interfaces and is designed to retain SONET/SDH’s strengths while addressing its weaknesses.
- Improves bandwidth efficiency through Ethernet’s use of statistical multiplexing.
- Supports higher-speed services than SONET-based DS-1 and Frame Relay services.
- Supports VoIP, which is more cost-effective and bandwidth-efficient than circuit-switched voice technology.
- Enables new network services such as SIP.
- Reduces the cost of your network access.
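The statistical-multiplexing gain in the first bullet can be illustrated with a toy simulation. The source count, peak rate and activity level below are hypothetical; the point is that TDM must reserve every channel’s peak rate, while a packet network only needs capacity for the realistic aggregate:

```python
import random

# Toy comparison: TDM peak allocation vs. statistical multiplexing for
# 20 bursty sources (hypothetical: 10 Mbps peak, active 30% of the time).
random.seed(42)
N_SOURCES, PEAK_MBPS, ACTIVE_PROB = 20, 10, 0.3

# TDM reserves every source's peak rate at all times.
tdm_capacity = N_SOURCES * PEAK_MBPS

# Statistical muxing sizes the link for the observed aggregate demand.
samples = []
for _ in range(10_000):
    demand = sum(PEAK_MBPS for _ in range(N_SOURCES) if random.random() < ACTIVE_PROB)
    samples.append(demand)
samples.sort()
statmux_capacity = samples[int(0.999 * len(samples))]  # 99.9th percentile

print(f"TDM: {tdm_capacity} Mbps, statistical mux: {statmux_capacity} Mbps")
```

With bursty sources like these, the statistically multiplexed link needs well under the 200 Mbps a TDM network would reserve, and that headroom translates directly into the cost savings Ethernet transport promises.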
Ethernet transport uses a centralized management system (control plane) in place of distributed management systems such as IP/MPLS. This permits use of the same low-cost Ethernet switches used in VLAN solutions while providing network control with functionality very much like that of IP/MPLS. With centralized control, network operators can immediately invoke prepared contingency plans for network recovery rather than rely on individual network elements.
By combining the flexibility and cost competition of multiple access providers with an Ethernet network topology, clients avoid the corner a single provider paints them into, sidestep the pitfalls associated with SONET ring access and save money on top of it all. And after all…no one wants to be painted into a corner!
“Data center design, construction, technology and administration is sufficiently expensive, specialized and scalable, both horizontally to bigger facilities and vertically to multiple tenants, that data center operations are increasingly a game for specialists.”
Kurt Marko, IT Analyst, Information Week
When writing the above for Network Computing, Kurt was referring to the results from the Uptime Institute’s 2013 Annual Data Center Industry Survey, which clearly identified wholesale data centers and their growth as a key finding. In past updates I have provided, we have noted how clouds benefit from the increasing availability of wholesale data center space, and the tier levels that need to be considered when thinking about uptime for any type of enterprise computing environment.
He also points out that the “default position” for enterprise data centers has shifted from maintaining only “in-house facilities” to outsourcing. As the Uptime report puts it, “data suggests the third-party data center service providers are growing at the expense of in-house IT operations.”
Part of the reason for the shift is that wholesale data center facility providers are more in tune with the data center world. As Mr. Marko puts it, third-party providers, “being in the actual business of operating facilities and selling data center services, do a much better job of measuring, documenting and articulating their value through cost and performance metrics.”
Even more important, through third-party providers, strategic management information is more regularly shared with and acted on by C-Level personnel. The report found that more than 70% of third-party data center operators report cost and performance information to C-Level buyers, versus just 42% of internal data center operators. This is even more evidence that wholesale data centers are not just a new approach to enterprise data centers; they may provide additional value to organizations that make them part of their cloud strategy.
When you have a minute, take a look at the article and visit the Uptime summary where you can download a full copy of the survey results…I think you will find them interesting.
Melbourne Health’s recent deployment of a new data warehouse and adoption of a business intelligence (BI) platform, undertaken to gain greater insight into how well it is serving patients while still controlling costs, offers a look at what Obamacare may drive in U.S.-based healthcare systems.
To healthcare systems, “Big Data” really means getting control over both structured and unstructured data across multiple systems. By building a single data warehouse, Melbourne Health has at least corralled its data into one place where BI platforms can access it and perform analysis that can lead to proactive solutions for healthcare problems.
In fact, healthcare reporter Paul Cerrato asks U.S. hospitals point blank in an article for Information Week, “How mature is your hospital’s business intelligence system?” He points out that health organizations today face more pressure than ever to lower the cost and improve the quality of healthcare.
Meeting this challenge takes multiple layers of innovation that many healthcare systems, even the larger ones, are not ready to handle. Increased interest in mobile health application development, remote medical interfacing using IP video, digging through a surge in data being gathered on patients, and then protecting that data takes new levels of expertise.
Quoting Jim Adams of The Advisory Board Company, he points out that penalties that are part of the meaningful use regulations, such as those for failing to reduce the 30-day readmission rate for preventable complications, could be avoided through the use of technology. “If you want to identify patients who are at high risk for hospital readmission, for example,” Adams says, “you can’t do that once a month. You have to do that daily, or even near real time.” Yet healthcare systems are sometimes challenged by a lack of internal expertise in data warehousing, business intelligence or application development.
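A daily screen of the kind Adams describes can be sketched in a few lines. The patients, weights and threshold below are invented for illustration and are not a validated clinical model; a real system would score live warehouse data against risk factors chosen by clinicians:

```python
# Illustrative daily readmission-risk screen (hypothetical scoring rule,
# not a validated clinical model): flag patients whose combined risk
# factors cross a threshold so care teams can intervene before discharge.
patients = [
    {"id": "P001", "age": 78, "prior_admissions_12mo": 3, "chronic_conditions": 4},
    {"id": "P002", "age": 45, "prior_admissions_12mo": 0, "chronic_conditions": 1},
    {"id": "P003", "age": 67, "prior_admissions_12mo": 2, "chronic_conditions": 2},
]

def risk_score(p):
    # Hypothetical weights, for illustration only.
    return p["age"] / 10 + 2 * p["prior_admissions_12mo"] + 1.5 * p["chronic_conditions"]

THRESHOLD = 15
high_risk = [p["id"] for p in patients if risk_score(p) >= THRESHOLD]
print("Flag for follow-up today:", high_risk)
```

The point is less the formula than the cadence: run the query every day against the warehouse, not once a month from a spreadsheet.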
Healthcare providers, faced with these new challenges, can find the necessary expertise at a far more affordable cost by utilizing outsourced resources. Business intelligence and application development are new areas for hospitals and practitioners. Major health providers, like many insurance firms before them, have not kept up with technology. Many are only now starting to fully implement electronic health records, despite the fact that the mandate has existed for many years.
Health insurers are also in the mix, and between the two industries they will need more sophisticated systems to help them meet the challenges of the comprehensive health insurance reforms launching in 2014. Facing an aging population with an increasing prevalence of chronic illness, these organizations will be looking to technology for ways to manage healthcare costs.
Regardless, healthcare faces its greatest challenge in decades. Only by turning to technology will these organizations have any hope of meeting it successfully.
Departing from my normal tech oriented news and blog posts, I offer up a classic Kompoz instrumental jam fest. Featuring the talents of Bill “Rhythm King” Howe, Mark “Spin Meister” Bowen, Jeff “Let it Rip” Chalpan, Neil “Flying Fingers” Schmidt, Jay “King o’ the Keys” Schankman, and the one and only Neil “Mr. Selmer” Montgomery on Sax.
Jeff leads off the solo fest, followed by me on baritone guitar; then Jay chews up the keys on his B3, Neil switches up the feel a bit and gives us good sax, Chalpan and Schmidt trade licks, and then Mr. Schmidt makes us proud by setting his guitar neck on fire using only his fingers. After that it builds to a mind-boggling “all in” free-for-all.
Nothing like a good old fashioned jam session to get your Friday blood moving! Here tis:
Don’t Wait Up by Bill Howe
Genuitec’s MyEclipse Blue wants to release you from being tied to RAD and “expand your development horizons.” A leader in the Eclipse Foundation open source community, Genuitec provides both platforms and development tools for developers and companies building applications on the Eclipse platform.
Industry leaders Borland, IBM, MERANT, QNX Software Systems, Rational Software, Red Hat, SuSE, TogetherSoft and Webgain formed the initial eclipse.org Board of Stewards in November 2001. By the end of 2003, this initial consortium had grown to over 80 members. Today the Eclipse Foundation manages the IT infrastructure for the Eclipse open source community, including Git code repositories, Bugzilla databases, development-oriented mailing lists and forums, the download site and the web site. The infrastructure is designed to provide reliable and scalable service for the communities developing with Eclipse technology and the consumers who use it.
Genuitec’s MyEclipse Blue 2014, released early in March, “is an inexpensive, open source friendly tools stack updated to support the newest WebSphere version and now includes support for lightweight IBM Liberty Profile server,” according to their PR Web release.
The introduction, written by Genuitec’s Product Manager for MyEclipse Brian Fernandes, states, “We are excited to add IBM Liberty Profile server support into our technology portfolio. Now developers are readily able to develop and deploy applications to IBM’s WAS servers or to Liberty Profile – we’re giving IBM shops more flexibility for app development than even IBM has provided so far. The high adoption rate of our Blue technology speaks to the need for powerful app development tools to support existing IBM investments, such as WebSphere servers.”
The product site touts MyEclipse Blue’s ability to “Rapidly build applications for WebSphere and Liberty Profile servers. Easily configure WebSphere specific deployment descriptors, and quickly deploy applications with instant deploy technology across a number of WebSphere server versions.” The technology stack employed has assembled a broad spectrum of tech from multiple vendors to allow development and deployment to exist under one umbrella.
Called the “MyEclipse Blue Workbench,” it gives IBM shops access to WebSphere Application Servers 5.1–8.5, Portal Servers 6–8 and the Liberty Profile 8.5 connector, which makes it even easier to develop applications against these servers. Future releases of MyEclipse Blue will continue to focus on Liberty Profile support. Additionally, 14 new assist-based editors for WebSphere descriptor files have been added, allowing for advanced development and cutting back on a developer’s need to recall file syntax; developers can now simply click a file and make edits.
As a technology designed to coexist with IBM Rational tools, MyEclipse Blue serves companies as an all-around Java development tool that can import existing RAD projects with one click, or export projects back to RAD just as simply. MyEclipse Blue is lighter than IBM RAD and less expensive, yet more powerful, as it features the latest open source and proprietary technologies, including being built on the latest Eclipse Kepler release.
If your organization is using IBM WebSphere you may want to consider development tools and environments like those that are made possible by open source communities. Tool sets like MyEclipse can make the time consuming tasks associated with developing new applications and cloud initiatives based on WebSphere technologies less of a strain on your internal development resources.
Many seasoned mobile application developers are realizing that the consumer app market is getting very saturated. They also understand that the cloud and its many permutations are creating new opportunities for more business-oriented mobile application development.
Virtually every business cloud, from customer relationship management to business collaboration to field operations control, is finding that motivating user productivity and adoption means providing mobility on devices users are familiar with. Soon we will all be less interested in apps that simulate drinking a beer and more interested in accessing useful information and data when, where and on whatever device we need it.
Cisco thinks so. That’s why the company entered the software development arena as part of its new Enterprise Mobility Services Platform (EMSP). Although best known for networking hardware and the infrastructure components that create the Internet’s fabric, Cisco announced this month the launch of its own set of mobile app development tools as the cornerstone of EMSP. This is part of a new emphasis on blending mobile app development with marketing.
According to Hari Harikrishnan, Senior Director of Cisco’s Services Platforms Group, “up to 50 percent of all marketing hires in 2014 will have a technical background. The shift is happening,” Harikrishnan said, “and successful mobility vendors will need to align to [line-of-business] requirements to be successful.” The EMSP will be released in late May, he said, “to meet the next wave of mobility.”
In his blog he writes about how this effort will aid businesses to build “hyper-context aware” mobile applications. He explains that, “Context aware mobile applications leverage information about ‘where’ the user is. In contrast, Hyper-context aware applications move beyond just location and include ‘who’ the user is, and ‘what’ interest or past behavior the user has exhibited. It provides pinpoint context precision, using mobile apps to enable ‘connected people.’ This information becomes the foundation for tailoring a personalized mobile application experience to the employee or customer, based on their wants and needs.”
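That who/where/what idea can be sketched in a few lines. Everything below, the function name, the rules and the sample data, is hypothetical and is not part of Cisco’s EMSP; it simply shows how location, identity and past behavior combine to pick content:

```python
# Minimal sketch of "hyper-context": combine where the user is (location),
# who they are (role), and what they've done (recent interests) to tailor
# the mobile experience. All rules here are invented for illustration.
def tailor_offer(location, role, recent_interests):
    if location == "store" and "running shoes" in recent_interests:
        return "Coupon: 15% off running shoes, aisle 7"
    if role == "field_technician" and location == "customer_site":
        return "Open the service history for this site"
    return "Generic welcome message"

print(tailor_offer("store", "customer", {"running shoes", "fitness"}))
print(tailor_offer("customer_site", "field_technician", set()))
```

Plain context awareness would stop at the first argument; adding the second and third is what Harikrishnan calls pinpoint context precision.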
Cisco’s plans are at the extreme end, and there are already a number of mobile business applications contributing to business mobility. Nearly every form of CRM, ERP and even many HR-oriented on-premise platforms have become increasingly mobile accessible.
It will be important for enterprises and SMBs alike to evaluate existing investments in CRM or ERP platforms and determine if a mobile application could increase the ROI of these systems. Doing so may well reveal your organization’s next winning move.
There will always be a market for consumer apps, but the opportunities for mobile applications built to meet specific business needs are growing. Better to examine your options now than to be eating a competitor’s dust in the near future.
During the SharePoint Conference in early March, Microsoft exploded with integration after integration based on multiple acquisitions and technology platforms it is taking to the “cloud.” One of two social networking technologies the big M announced was its new search and discovery app, code-named “Oslo.” Both technologies are based on Microsoft’s “Office Graph” engine, a new information fabric that works across communications silos for Office 365 subscribers.
How does Oslo work? Oslo “delivers insights to end users about their communications,” according to Gregerson. Oslo is based on Microsoft’s FAST enterprise search technology and was developed by a Microsoft team in Oslo, Norway. It’s not related to the “Oslo” modeling technology that went by the same code name.
Cem Aykan, senior product marketing manager for Microsoft Office, demonstrated Oslo, which will get a new name at product roll-out. Oslo uses “content cards” that show information associated with documents. Users can check to see what documents were shared with them, as well as the documents that are part of their “trends.”
Oslo pulls the information together via the Office Graph engine. For instance, Oslo will check a user’s calendar and see that a user was scheduled for a meeting. It can then deliver a recording of that meeting, if available, as a document for the user’s review. It doesn’t matter if content was shared via Yammer or the OneDrive for Business app because Oslo will find it and make the content available, based on the user’s trends.
Oslo joins a number of new features like support for SQL Server 2014, expected mid-2014, and new records management system features already available. The latter includes ways to better utilize SharePoint’s ability to distinguish between a record and a document, including automatic rights-management and retention policies for documents, while leaving control over the designation to the end user.
For many organizations, defining the difference between records and documents is important because of the complexities of compliance. A record may have a specific definition according to compliance requirements; the handling of records versus documents may be carefully spelled out; who has access to records as opposed to documents within a particular site may be a factor; and what happens to a record or document once its active life has expired is almost always a consideration.
SharePoint 2013 records management offers two options for accommodating these sorts of compliance issues: the records archive, which isolates records from documents; and in-place management, which permits the handling of records and documents in a common location.
Deciding which of the new features to adopt, and how best to utilize them, can be a daunting task both for established SharePoint and 365 users and for those new to these systems. Organizations with long track records of working with businesses that have deployed either on-premise or cloud-based systems can lend the insights needed to properly plan for future improvements.
Although not the only option in town, SharePoint remains a major player with a large installed base. These new features and capabilities are sure to be welcomed by that community of loyal customers.
In her blog “Forrester’s Top Trends For CRM In 2014”, Kate Leggett points to 14 trends that are going to drive customer relationship management (CRM) this year. A couple of the trends caught this writer’s eye and point to how critical mobile and social are in the customer life cycle.
Trend number 4 on her list states that “The Mobile Mind Shift Will Force CRM To Evolve Quickly,” which clearly suggests that developments in the deployment of mobile CRM will happen quickly and that 2014 is going to be a big year for these systems. Like business intelligence platforms, CRM systems have evolved to the point that they are “real-time” platforms for engaging customers at multiple stages and in multiple environments.
She suggests that, “Mobile CRM solution support will continue to rapidly evolve, and every CRM vendor has a mobile offering. But assembling the components of an effective mobile CRM solution to meet the precise use case for a specific type of mobile worker, or customer interaction, requires navigating a complex set of technology, process and people decisions.”
This trend dovetails nicely into her trend number 6, “Social Will Connect At All Stages Of The Customer Life Cycle.”
Although she mentions “Forrester’s approach”, a reference to the purchasable report on their “POST” model for social engagement strategy development, the real point is that social in some form or another has touch-points throughout your customer’s life cycle and has become a critical source for both data and interaction.
She suggests, “Social technologies have changed the way businesses interact with their customers. Yet, nearly 10 years into the social media boom, many executives are still reactive about adopting social media technologies, instead of focusing on the goals they want to accomplish. We predict that more companies will use a strategic approach, such as Forrester’s approach, to leverage social technologies across all stages of the customer life cycle.”
Now POST doesn’t necessarily tell you which social platform to use; in fact, the model discourages making that decision before you take two other steps. The first, according to the report’s summary, is to “examine the Social Technographics® Score of your target audience.” You can sum that one up as “know your audience”. You are welcome to use any methodology to do that, but gathering and carefully understanding data about your prospects and customers has to be the first step in any successful marketing effort or plan.
Second is a simple one: “choose your objective”. CRM as a discipline and CRM systems can be used in an almost unlimited number of ways, depending on their features; this means that your objective, and how you use CRM to achieve it, is unique to your organization. It is also dependent on what you discover about your customers and prospects in step one.
Here again it is more than likely that your social touch-points will increasingly be engaged by your customers on mobile devices. This is now true in business, not just in consumer adoption, and in both cases mobile CRM can provide a one-to-one channel connecting your marketing and sales directly to buyers.
The entire set of 14 trends is an eye opener, and 2014 will be a busy year for both buyers of CRM systems and suppliers of CRM systems and services. Over the last two years there has been an explosion of both on-premise and cloud-based CRM and marketing automation products, so buyers have their work cut out for them sorting through the field and finding the best fit for their unique needs.
Experts are out there and if you need to better understand both your options and how they fit into your business, consult one. The best are companies that have strong histories in mobile, business intelligence and CRM planning, development and management. You can go with more expensive local or regional options or you may want to explore outsourcing to offshore experts where costs often are lower.
It will be a big year for digital marketing and the systems that support it, even more so for SMBs, as many of these new systems target them and offer expensive system features at a fraction of what the older, more established platforms cost. The sooner you find your place in this digital marketing revolution, the sooner you’ll get a leg up on your competition.
This year’s Mobile World Congress included a lot of buzz about “wearable tech”. From smart glasses to smart watches, many tech companies and developers are eyeing these new devices as opportunities for new business.
Take for example Samsung’s “Gear 2” smart watch. Samsung was at the conference with one primary mission: get developers working on Gear 2 apps, knowing full well that wearable tech is only as good as what it can do for the user, and that means apps.
App development is so important to Samsung that in his keynote address for the company’s “developer day”, Samsung’s Media Solution Center head, Curtis Sasaki, introduced tools that allow developers to create apps for both its wearable tech and its Galaxy S5 line of smartphones. Key in the presentation was the software development kit for the Gear 2 smart watch, which is based on the Tizen operating system, and an SDK for its S Health application. Samsung also launched a Gear Fit SDK that will allow developers to make apps for Android devices that can interact with the Gear Fit.
According to Sasaki, “One of the key goals was to make it really easy to make Gear applications.” He also said that the Gear 2 software, while running on Tizen, is ready to interact with Samsung’s Android devices because of a widget that runs on the Gear 2. Developers can also create standalone applications for that device. In the case of the Gear Fit, a host application in Android is always required.
Tizen is an open-source software platform based on Linux that serves as an alternative to Android, and Samsung is using three new wearables as a way to launch the platform. During the conference the mobile device manufacturer unveiled the Gear 2, Gear 2 Neo and the Gear Fit health band. The wrist-based Gear Fit is Samsung’s first “fitness tracker”, and the band includes health-tracking software along with pedometer functionality. Like the Galaxy S5, it also has a heart rate sensor on board. Unlike most dedicated fitness wearables, though, it also pushes notifications from your smartphone, including third-party alerts from non-Samsung Android apps.
Google can be cited as the first to truly push wearables into the public eye via its Google Glass concept. Its current push is to make the Android operating system available to any type of wearable tech, and it has plans to release its own smart watch, to be manufactured by LG Electronics. Like Samsung, Google is leaning heavily on independent and enterprise developers to build apps and is making Android available for free to phone and tablet makers. In the near future it will also release a new software development kit based on Android specifically for manufacturers and developers of wearable tech.
Unlike Samsung, which is pushing an alternative to Android because of perceived instability issues and, because Tizen is proprietary to its products, a desire to drive users to buy only from its line of devices, Google wants to keep things “open”. Google understands that 80% of smartphones now run Android and, after overcoming some early issues, expects that 80% of wearables will also be running Android.
From healthcare to enterprise-level access to business intelligence, wearable tech is increasingly driving new app development and gives even small companies a reason to get in the game early. For healthcare this could mean direct, ongoing monitoring of patients by way of a wrist-based device like the Gear Fit. For a CEO it might mean being able to call up a complete report vital to a negotiation by way of Google Glass, in real time, anywhere and at any time it is needed. No more running back to the office in order to interface with your data and information via the desktop ball and chain.
You’ll need help at first and many development companies have been following these new technologies right from the start. Considering how many ways business can be served by utilizing mobile computing and potentially wearable devices, consulting an expert for ideas could give you a “leg up” on your competition.
It seems clear, though, that just like the smartphone, wearable tech will create further demand for companies and developers alike to build applications across a wide range of business needs and consumer application markets.
If you read the McKinsey Global Institute’s report, “Disruptive technologies: Advances that will transform life, business, and the global economy”, you will note that of all the disruptive technologies, the most disruptive will be the growth of the “mobile Internet”. In truth, they are simply saying that the Internet we have always known has now fully absorbed mobility, and the PC is no longer the preferred device for accessing it.
It is important to note that McKinsey looks at disruption in technology with the respect that it should stating, “Technology is moving so quickly, and in so many directions, that it becomes challenging to even pay attention—we are victims of ‘next new thing’ fatigue. Yet technology advancement continues to drive economic growth and, in some cases, unleash disruptive change. Economically disruptive technologies—like the semiconductor microchip, the Internet, or steam power in the Industrial Revolution—transform the way we live and work, enable new business models, and provide an opening for new players to upset the established order. Business leaders and policy makers need to identify potentially disruptive technologies, and carefully consider their potential, before these technologies begin to exert their disruptive powers in the economy and society.”
Among the arguments for picking the mobile Internet for the number one spot was “the rate of technology improvement and diffusion”, which compared the cost of a supercomputer in 1975 ($5 million) with the cost of an iPhone 4 ($400) offering the same computing performance (MFLOPS). Add to that the growth of smartphones and tablets since the iPhone’s launch: a six-fold revenue growth rate (as of May 2013), very nearly doubling every year. At that speed it is certain that the dominant method for using the Internet, whether through a home network and landline access or away from home via Wi-Fi or mobile access, will be a mobile device.
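Taking the report’s equal-MFLOPS claim at face value, the price-performance shift is easy to quantify with back-of-envelope arithmetic:

```python
# Back-of-envelope arithmetic using the figures cited above: a 1975
# supercomputer at $5,000,000 vs. an iPhone 4 at $400, with roughly
# equal computing performance (MFLOPS) per the McKinsey comparison.
supercomputer_1975 = 5_000_000   # USD
iphone_4 = 400                   # USD

cost_ratio = supercomputer_1975 / iphone_4
print(f"Same performance at 1/{cost_ratio:,.0f} of the cost")
```

That is a 12,500-fold drop in the cost of a given level of computing performance over roughly three decades, which is the “rate of improvement and diffusion” the report is pointing at.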
The report also mentions that there are 4.3 billion people remaining who are not connected and most likely they will eventually connect by way of mobile Internet access. It also noted that 1 billion transaction and interaction workers are engaged by way of the Internet (nearly 40% of the total global workforce), $1.7 trillion in GDP (globally) are related to the net, and that net-based interaction and transaction workers represent 70% of global employment costs.
These latter stats are considered “illustrative pools of economic value” that over the longer term could be impacted by the growth of mobile Internet access, access devices and mobile-based services.
As an example, mobile business intelligence (Mobile BI) was seldom considered until a handful of years ago. Most thought that business intelligence would always mean a small group of analysts sitting in front of big PC monitors churning out dashboards that no one understands. Any form of mobility was thought to be a luxury, and what did exist didn’t exactly drive end-user adoption beyond those data scientists. Today most of the traditional BI systems and platforms have introduced mobile offerings to provide BI on mobile devices.
According to TEC’s 2014 Mobile BI Buyers Guide, which cites an earlier Citrix report, mobile BI is important now and will be more important in the future. It states: “mobile BI apps nowadays occupy a significant fourth place among other productivity apps in an organization. Information workers, ranging from top decision makers to middle management and operation workers, are increasingly using mobile BI apps for reviewing and analyzing data, thus replacing the traditional paradigm for performing data slicing and dicing using a PC or laptop.”
“Moreover, it appears that mobile BI apps are a natural fit with other mobile systems within the corporate mobile ecosystem. Due to their visual nature and ability to provide consistent and intensive interaction with other business systems, mobile BI applications enable users to perform data discovery and analysis using a user-friendly interface, while providing extensive collaboration features and enabling the sharing of rich content (e.g., voice, images, and text). Mobile BI capabilities can either be embedded within mobile BI offerings, or can sit nicely on top of enterprise collaboration platforms. The latter case is seen with platforms such as Microsoft SharePoint and TIBCO tibbr, which enable rich embedding of mobile BI and analytics as part of their collaborative offerings.”
Businesses of all sizes will need to carefully evaluate the disruptive nature of mobile and consider, as the BI community is, which other systems, end users and opportunities could benefit from the power of mobile access. Many excellent firms across the globe are already down this path and have a comprehensive understanding of the road ahead. Engaging one of them to help you evaluate how emerging technologies can be maximized for your business, and to harness the disruption, may set you ahead of your competition.
It is very clear, however, that mobile is not just here to stay but is going to be a disruptive and world-changing technology across many domains for a long time to come.
It was interesting to note that the subject of mobile application developer Lauruss Infotech’s recent press release was the launch of a “responsive” website on which they will be selling their mobile apps.
The reason this caught our attention was that a responsive website is designed to function on any device, whether it is a traditional PC, a tablet or a smartphone. Soon they may have to accommodate display on some wearable devices but that is a bit further down the road.
For some time now there has been debate about whether you should develop mobile apps using native device programming or using something like HTML5 to make them Internet based. Now many add that a mobile app can be effectively deployed using a “mobile first” responsive web design.
The idea that this is a reasonable track to follow for mobile app development is strongly supported by the success stories of websites that have transitioned their retail sites to a responsive design. Case in point: MandM Direct, which reported that conversion rates among customers using mobile phones and tablets were up and that bounce rates had dropped “significantly” thanks to the responsive design coupled with a mobile-first delivery technology.
In fact, according to the company’s IT director Graham Benson, “In my opinion, a lot of retailers build mobile and tablet apps to compensate for the fact that they simply can’t deliver an optimized web experience on mobile phones.” This suggests that, at least in his opinion, you should either have a responsive website and skip the apps or, better yet, have both.
There is a lot of room for growth in the area of responsive websites. According to a report by Restive Labs, a responsive web framework developer, only 3% of enterprise-level websites are responsive and only 15% of all websites are “fully” responsive. Fully responsive, by their definition, means sites that “require no redirection for optimal use on a mobile device.”
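The “no redirection” criterion can be roughly approximated with a short script: fetch a page as a phone would and check whether the final URL has hopped to a separate mobile host. This is only a sketch under that assumption; the hostnames and the “m.”/“mobile.” naming convention below are illustrative, not Restive Labs’ actual methodology.

```python
# Hedged sketch of the "fully responsive" test described above: a site
# should need no redirection for optimal mobile use. We detect the
# common "m-dot" redirect pattern. Hostnames here are placeholders.
from urllib.parse import urlparse

MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) "
             "AppleWebKit/537.51.1 (KHTML, like Gecko) Mobile/11A465")

def looks_like_mobile_redirect(original_url: str, final_url: str) -> bool:
    """True if the final URL's host differs in a mobile-specific way."""
    orig_host = urlparse(original_url).hostname or ""
    final_host = urlparse(final_url).hostname or ""
    if orig_host == final_host:
        return False  # same host: no mobile redirect occurred
    return final_host.startswith(("m.", "mobile.")) or ".m." in final_host

# A fetch wrapper (which would hit the network) might look like:
# import urllib.request
# def final_url(url):
#     req = urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})
#     return urllib.request.urlopen(req, timeout=10).geturl()
```

A responsive site passes this check trivially: the mobile and desktop URLs are the same, and layout adapts via CSS rather than server-side redirection.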
“The takeaway from this report is that enterprise websites, and perhaps websites in general, still have a long way to go to attain readiness for a world where the smartphone or tablet is the primary web access device,” said Obinwanne Hill, founder and CEO of Restive Labs and author of the report. “It’s hard to believe that almost 4 years after responsive web design and 10 years after web performance optimization came to prominence, there is still very low adoption of these important techniques.”
Building a responsive website isn’t especially difficult for most competent web development firms, and developing mobile apps based on responsive web technologies and frameworks has likewise become a well-developed discipline. Very nearly any business of any size can have one for a cost driven only by how deep or feature-rich the site itself might be.
Enterprise and mid-size firms should be looking at moving their eCommerce and front facing websites to a responsive platform sooner rather than later as the ease of use on multiple devices is driving adoption of them faster every day. Consulting with an expert firm, knowledgeable in how to build responsive webs and applications, may be your best bet if you lack the in-house expertise or time to further explore the benefits.
Responsive websites were once a novelty, but those days are past. Necessity is the new destination, and those who don’t catch on will simply miss the boat, or the customers, depending on how you look at it.
Probably the most profound creation since the invention of the telephone and the industrial revolution, the world wide web is now 25 years old. It is important to note that there is a difference between what we usually simply refer to as the “Internet” and the world wide web.
The interconnection of networks that is the true Internet is much older; it was already in place, connecting a number of the early mainframe computers throughout the world, all containing some form of data. What didn’t exist until just 25 years ago was an easy way of sharing, viewing and publishing that data so that anyone could interact with and use it.
That is when Sir Tim Berners-Lee invented the “world wide web”. He was working as a software engineer at CERN, the European Organization for Nuclear Research in Switzerland, in 1989 when he conceived it as a simple attempt to improve communications among the thousands of scientists involved in the organization’s work. Despite the fact that the global system connecting computer networks was in place and computers were getting into the hands of new “end users”, there still wasn’t a useful way to make it all “browse-able” by a wider audience.
Berners-Lee, drawing on his computer programming expertise, took up the challenge and, even after his paper proposing the idea was poorly received, went on to create the HyperText Transfer Protocol (HTTP) and HyperText Markup Language (HTML), and built the very first browser, which he dubbed WorldWideWeb. By 1993 CERN allowed Sir Tim’s technology to be used by all, and only a few years later millions of people across the globe were hooked up to, and hooked on, the information the “Internet” contained.
There are reasons to reflect a bit on the creation of the web itself, but many folks worry about the basic foundation that the WWW runs on: the Internet. In 25 short years, development for the web has become the foundation for most of our economies. Billions of users access it from an enormous and growing range of devices. Some of the foundations we have become dependent on to deliver the web, though, are becoming strained.
Berners-Lee, and the others who followed shortly after to further develop the web, didn’t at first envision how many permutations of their basic idea would spring up, or how many people would use it for criminal enterprise. Their basic mission was to broadly share vast amounts of information; they didn’t think those same systems would eventually be used criminally or, worse yet, as weapons of war.
Increasingly we are hearing calls for the adoption of several new approaches to the basic infrastructures that make up the Internet, approaches that would plug many of these holes and rebuild the foundation with the current state of the web in mind. Several voices continue to support, no matter how difficult and expensive it is for network operators, the abandonment of IPv4 in favor of the more secure IPv6, which is now reaching an acceptable level of maturity.
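A first, easily checked step in that transition is whether a hostname publishes IPv6 (AAAA) addresses at all. Below is a minimal sketch using only the Python standard library; the hostname in the commented example is a placeholder, and a non-empty result only shows a site has taken the first step toward IPv6 reachability.

```python
# Hedged sketch: does a hostname resolve to any IPv6 addresses?
# Publishing AAAA records is a necessary (not sufficient) step
# toward serving traffic over IPv6.
import socket

def ipv6_addresses(hostname: str) -> list:
    """Return the unique IPv6 addresses a hostname resolves to (may be empty)."""
    try:
        infos = socket.getaddrinfo(hostname, None, socket.AF_INET6)
    except socket.gaierror:
        return []  # no AAAA records, or name does not resolve
    return sorted({info[4][0] for info in infos})

# Example (requires network access; hostname is a placeholder):
# print(ipv6_addresses("example.com"))
```

An empty list for a production site suggests its operators have not yet begun the IPv4-to-IPv6 move discussed above.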
Until the foundation infrastructure is addressed by adopting new measures, it will be increasingly important for companies, which depend on the web for just about everything these days, to take whatever steps they can to better secure their websites. Having a third party inspect and analyze your web and data warehouse security, taking into consideration all the threats both existing and projected, can give you an objective look and a set of recommendations that might save you from data theft or the other downsides an aging infrastructure can create.
But despite the fact that the Internet could use some upkeep, the world wide web is going strong and continues to be one of the most important inventions in human history. All thanks to Sir Tim.
It is safe to say that WordPress was a revolution of sorts in the world of content management systems (CMS). Certainly not the first, many of us were developing CMS platforms as early as the late 90s in one-off web development projects built from the ground up. Many of the code snippets themselves ended up on open-source sites all over the web thus laying the groundwork for the first truly open-source blog CMS, WordPress.
According to WordPress’ website counter there are 76,793,570 websites running their CMS. As of March 2014 60% of all websites running a content management system are running WordPress.
That is why the recent revelation is so troubling: the long-used, but mostly useless, pingbacks (close relatives of trackbacks) are providing a “back door” to hackers, who use websites running WordPress as part of distributed denial of service (DDoS) attacks. The first discovery affected over 160,000 WordPress-powered websites, which were used as DDoS zombies to launch attacks from their web hosts’ servers.
First reported by Daniel Cid of Sucuri, Inc.: a client’s website had gone down due to a DDoS attack, which eventually grew to the point that their web host had to shut the site down completely until the attack source could be identified and removed. What Sucuri discovered was that one attacker was using thousands of well-known, popular and, at least from an average security viewpoint, clean and safe websites to launch denial of service attacks. How? By sending a simple pingback request to the XML-RPC file containing the method call for a pingback.
Cid recommends that you disable the pingback function, and the best way to do that is to create a small plugin for the purpose. The plugin needs to add a filter (he provides the code in his post), and Sucuri has a nice little tool for checking your own site to verify whether it is being used in a DDoS attack.
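As a rough illustration of the kind of check such a tool performs, the sketch below asks a site’s xmlrpc.php to list its methods and looks for pingback.ping. This is an assumption-laden sketch, not Sucuri’s scanner or Cid’s plugin code; the placeholder domain and the reliance on system.listMethods being enabled are both assumptions.

```python
# Hedged sketch: check whether a WordPress site still advertises the
# pingback.ping XML-RPC method. The target domain is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

LIST_METHODS_BODY = (
    "<?xml version='1.0'?>"
    "<methodCall><methodName>system.listMethods</methodName>"
    "<params></params></methodCall>"
)

def advertises_pingback(xml_response: str) -> bool:
    """True if an XML-RPC system.listMethods response lists pingback.ping."""
    root = ET.fromstring(xml_response)
    methods = [v.text for v in root.iter("string")]
    return "pingback.ping" in methods

def check_site(base_url: str) -> bool:
    """POST system.listMethods to the site's xmlrpc.php and inspect the reply."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/xmlrpc.php",
        data=LIST_METHODS_BODY.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return advertises_pingback(resp.read().decode("utf-8"))

# Example (would hit the network; domain is a placeholder):
# if check_site("https://example.com"):
#     print("pingback.ping is still exposed")
```

If pingback.ping shows up in the list, the site can still be asked to “ping” an arbitrary URL, which is exactly the amplification primitive the attack described above abuses.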
In the early days of search engine optimization, pingbacks were often recommended because the traffic would aid in improving your search positions. Now Google and most of the other search engines no longer use this data, preferring richer content and contextual signals to drive results. A pingback simply provides one website an acknowledgement that another has linked to it. That’s it. Nothing more. It doesn’t improve SEO, and it now opens up a vulnerability that could be used to do more than just launch attacks on other websites.
If you are one of the 76+ million WordPress users and your company is dependent on both your website and keeping it secure, you may want to consult experts to provide you with the needed security precautions. Although the beauty of these systems early on was how easy they are to use, even for a novice webmaster, they are increasingly complex and WordPress is a big target that hackers find hard to resist.
Either way, you don’t need pingbacks anymore, and you should check regularly to make sure your site isn’t being used to attack others. It is still important to link to sources, particularly in the news and blog world, but the now-useless channel that acknowledges those links is no longer needed.
With the 2014 SharePoint Conference in Vegas wrapped up, it left in its wake many new features, capabilities and updates that the SharePoint customer base must now consider or incorporate. Many are long-awaited expansions of the integration between SharePoint and Office 365; many are further moves toward more robust cloud-based services.
One example is the availability of Service Pack 1 for SharePoint Server 2013 for download. SP1 is reported to have several improvements to the web-based content management and collaboration tools valued by their enterprise customer base. One feature is aimed at making it easier for on-premise deployments to get started in the cloud. To quote the “Office Blogs”, “With SharePoint Server 2013 Service Pack 1, you can connect your on-premises SharePoint 2013 servers to the cloud and to turn on OneDrive for Business or Yammer as the first workloads in the cloud and run those alongside existing SharePoint investments.”
“But we [Microsoft] wanted to go one step farther. So today, we are introducing a new very attractive priced OneDrive for Business Standalone offer. If you aren’t already using OneDrive for Business – now is the time to take your first step to embracing the cloud.”
Jeff Teper, Corporate VP at Microsoft, showcased the larger picture, piecing together these many new features and integration dynamics in his keynote address to the roughly 10,000 customers and partners in attendance.
“Today, the world has become a giant network where connections make information more relevant and people more productive. Most companies, however, are not working like a network, which we believe is vital for their ability to improve collaboration and respond to customers, competition and market changes,” Teper said. “The new Office 365 experiences powered by cloud, social, mobile and big data technologies enable people and teams to find the right connections and most relevant insights to get more done.”
One new tool rolled out was Office Graph, a new Office 365 intelligence fabric that remains unseen to the user but is constantly analyzing content, interactions and activity streams and mapping the relationships using machine learning to intelligently connect and surface the most relevant content to each user.
Office Graph will be the pathway that provides a whole new set of user experiences including a feature code-named “Oslo” which Microsoft describes as “A new application powered by the Office Graph and code-named “Oslo” was previewed onstage. It uses personal interactions and machine learning to surface a view of the most relevant, timely information for each individual using Office 365 across SharePoint, Exchange, Lync, Yammer and Office.”
Alan Lepofsky, vice president and principal analyst, Constellation Research commented that “Many organizations struggle with bringing together their content creation, collaboration and core business applications. Today’s Office Graph and ‘Oslo’ announcements describe an integrated future that could greatly improve the way Microsoft Office 365 users will get their work done, with easily discovered insights about important activity across their organizations that personally affect or may be interesting to them.”
With SharePoint still having one of the largest user bases in the industry, it appears from the conference that Microsoft is going forward with expanding the power, reach and mobility of these platforms. For current SharePoint owners, successfully employing these new functions and educating an end-user base in their use may require outside expertise. For smaller businesses this can be a costly proposition, but it need not be if they work with companies that pair North American-based project managers with lower-cost outsourced experts who can be called upon to make necessary changes to current SharePoint deployments.
Without a doubt Microsoft is flexing all of its cloud muscle and is building a bridge for on-premise users to maximize their current investment in SharePoint by providing tools to connect them to critical cloud services. With this being a pivotal year, companies considering the enhancements shouldn’t take too long to begin catching up, given the speed with which these technologies are evolving.
Author’s Note: This is an article written for a client in the IT and Application Development industry.
It’s safe to say that Microsoft has a long track record for using strategic acquisitions to bring together the ammunition it needs to go after a competitor. And their target this time is an old foe, Salesforce.
Three new Microsoft properties are lined up to take on SF’s Service Cloud, ExactTarget and Radian 6. Parature and MarketingPilot were acquired by Microsoft over the last two years, with the purchase of NetBreeze announced in January 2014. Each aims at providing an alternative to an SF service, with MarketingPilot serving as the marketing automation, resource management and campaign management component, as well as handling budgeting, media planning and buying.
Parature provides cloud-based customer service tools and allows Dynamics to now aid in engaging your clientele. The most recent purchase, Netbreeze, brings social analytics and monitoring to aid in driving the use and success of the other components.
The strategy itself was well laid out by Microsoft Business Solutions President Kirill Tatarinov in the Convergence 2013 keynote address.
Tatarinov addressed how business leaders can re-imagine the way they engage with customers, build brand relevance and collaborate with employees to stay ahead of the changing roles across all levels in business. Observing the increased influence of marketing and other areas of business over a company’s technology investments and the value of collaboration, Tatarinov reinforced the idea that business functions can work more effectively when they unite with their IT counterparts.
Tatarinov noted that Microsoft Dynamics is uniquely positioned to serve as a catalyst for unity, and he announced new advancements in integrated marketing, embedded social capabilities, and new cloud and mobile scenarios enabled through Microsoft Dynamics solutions that will help businesses unite with their customers, unite their organizations, and unite their people and technology.
“To realize the promises and possibilities of a world ahead, organizations must be united” Tatarinov said. “Microsoft Dynamics solutions re-imagine what’s possible for businesses, helping them unite to unlock innovation and creativity in people and to enable more meaningful experiences for their customers. When a business is truly united, great things happen.”
MarketingPilot has been renamed Microsoft Dynamics Marketing and integrates, as the other services do, with either cloud-based or on-premise deployments of Dynamics. Dynamics Marketing features a visual designer intended to make setting up campaigns easier for marketers, as well as overhauled lead scoring and management capabilities that integrate with Dynamics CRM.
Netbreeze also has a new name, Microsoft Social Listening. It will enable Dynamics CRM users to track the crowd’s opinion of brands, products, services and competitors across various social channels. Netbreeze’s technology can natively process sentiment across more than two dozen languages, a capability that carries over to this new Microsoft offering. Cloud subscribers will get Social Listening for free, but onsite customers will pay what Microsoft calls an “incremental cost.”
Dynamics will incorporate Parature via a “unified service desk”. The “desk” provides customer service agents a single console for CRM and billing integration.
Whether your business is using Dynamics in the cloud or onsite, these new features will also require greater insight into how best to integrate them and then use them to the benefit of your bottom line. Microsoft insists that the expansion of the platform’s capabilities will be a “uniter” of sales, marketing and IT, but this can only happen if the system is properly deployed, users are well informed and the deployment is well maintained.
For SMBs, engaging external experts with solid track records in both onsite and cloud-based CRM, social and marketing automation platforms is a wise move. Although every new capability from each of the major CRM providers is aimed at improving your outcomes, not knowing how to bring these internal groups together can still lead to disappointing results.
Salesforce and Oracle still hold the top spots but Microsoft is coming on strong. Whether you use customer relationship management (CRM) tools or are planning to, Microsoft Dynamics CRM has certainly become a platform to consider after these new additions to its capabilities.
Or is Big Data simply one of two “camps” that exist within the business intelligence community, as Eric Lebson and Ian Christopher McCaleb point out in their Holmes Report piece “Speaking Intelligently On Business Intelligence”? They rightly note that more traditional business intelligence is modeled after law enforcement or journalistic intelligence gathering, and show that digging into big data is a much different type of effort.
They see particular value in what they describe as input based on human intelligence, where data is ultimately gathered from human interaction. This is juxtaposed with another government intelligence-gathering effort, the NSA’s, where huge metadata sets are crunched in search of patterns that can lead to conclusions.
In their piece they write: “One distinct camp, the “big data” camp, focuses on the analysis of large quantities of data to develop analytic products. Huge reams of data are swept up by “bots” and crunched by any variety of software applications to provide products such as bench-marking, targeted reporting, data and text mining, performance measures, prescriptive or predictive analysis, statistical modeling, operational assessments and the like. Think, for instance, of a major beverage company researching the advertising purchasing patterns of a competitor to try to divine where that rival is focusing its growth strategy.
Public opinion polling too factors into this camp’s work as a means of making sense of the attitudes of large or specifically identified demographics.
When polls and data analytics are cross-referenced, a company or client may gain a greater context for understanding and operating within their market. When the term ‘intelligence’ is applied to this approach, it is often in terms of ‘market intelligence’ and the ‘intelligence’ in question is akin to what the NSA does when they look for patterns and linkages among huge volumes of communications meta-data.”
As this “camp” grows in influence within organizations, some fear that without careful oversight even marketers can overstep in their quest for an audience “secret”. The U.S. government is concerned enough to have convened an MIT workshop to study the rapidly emerging impact of big data efforts. White House adviser John Podesta, head of the presidential study on the future of privacy, describes it this way: “We’re undergoing a revolution in the way that information about our purchases, our conversations, our social networks, our movements and even our physical identities are collected, stored, analyzed and used.” He goes on to say, “On Facebook there are some 350 million photos uploaded and shared every day. On YouTube 100 hours of video is uploaded every minute, and we’re only in the very nascent stage of the Internet of things, where our appliances will communicate with each other and sensors will be nearly ubiquitous.”
Regardless of these concerns, this is a train that has already left the station, and the data is only getting more abundant and more critical to the future of most businesses. Healthcare stands to gain as its information silos get broken down; crunching all the structured and unstructured data there could lead to better patient outcomes. None of us can go back, as our own “data footprints” are deep, wide and everywhere.
If you are conducting business intelligence and you are not considering your needs as far as utilizing what’s buried in that gigantic pile of data…you may be left behind. If you are not incorporating business intelligence into your day to day organizational guidance at all, you are flat out doomed. This is why many companies are turning to outside experts in these fields to create short and long term strategies for warehousing and then mining that data pile.
Many IT departments are stressed because the technology needed to do the work is either new to them or they simply need a more objective look at their business intelligence systems. Some are already stretched thin, and taking on a very IT-intensive effort to bring big data insights to analysts and decision makers may be more than they have the manpower to accomplish. Both can easily outsource aspects of the planning, implementation and support to qualified firms, and may even find overall cost savings by doing so.
Either way, “Big Data” still remains a subset of business intelligence and not the total picture, yet. Given more time, big data will likely join artificial intelligence and the Internet of Things as a major part of our collective business efforts in the near future.
Author’s Note: This is an article written for a client in the IT and Application Development industry.
We wrote about this union not too long ago but ShoreTel’s recent unleashing of its sizable partner channels has increased the importance of this new addition to the already robust SF platform.
As much as we would like to think that “the phone”, that old standard method for engaging prospects and customers, has become extinct, it just isn’t true. Although the days of dinner interruptions by siding salespeople may be gone, the B2B universe is certainly still talking to the audiences it is selling to.
Now the phone is mobile and VoIP, two platforms with vast capabilities for tracking and maintaining data, and it remains one of the dominant forms of sales engagement. This is what ShoreTel tying its unified communications platforms to Salesforce CRM is all about: harnessing that data and bringing it into “closed loop” marketing efforts.
According to their February 18th press release on the subject ShoreTel suggested that, “By integrating two mission-critical platforms – the communications platform and a company’s critical enterprise application – businesses now have comprehensive reporting on the effectiveness of all communications and customer interactions. ShoreTel’s single source view provides insights that enable higher productivity and more effective collaboration of sales, support and marketing organizations for greater customer satisfaction and better business results.”
In many ways this plugs a hole in the content driven and marketing automation platforms that facilitate the entire conveyor that moves leads to contacts, contacts to opportunities, and finally opportunities to sales.
“Finally, phone data is aligned with business data in the CRM system. By integrating the ShoreTel communications platform with Salesforce, one of the most valuable business applications, customers gain new insights, more effective collaboration, higher productivity and better business results,” said David Petts, senior vice president of worldwide sales at ShoreTel. “Automatic logging of all sales activity, regardless of a sales person’s location or device, together with prompts for agents to schedule follow-up actions can lead to closing deals faster and higher overall sales team productivity. For example, a road warrior using a mobile phone with ShoreTel Mobility software will still have the activity logged.”
The applications, ShoreTel for Salesforce and ShoreTel Sky for Salesforce, provide integration between Salesforce and the ShoreTel on-premises IP-PBX system, the ShoreTel Sky cloud-based phone system, ShoreTel Workgroups, and ShoreTel Sky Contact Center.
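The “automatic logging of all sales activity” that these applications promise amounts to writing call records into the CRM. A rough illustration of what such a record looks like (the field names follow Salesforce’s standard Task object, but the helper itself is hypothetical, not ShoreTel’s actual code):

```python
# Build a Salesforce Task payload representing a completed phone call,
# the kind of record a unified communications integration would log
# automatically. Illustrative sketch only; not ShoreTel's implementation.

def build_call_log(contact_id, subject, duration_seconds, direction="Outbound"):
    """Return a dict shaped like Salesforce's standard Task object."""
    return {
        "WhoId": contact_id,               # the Contact or Lead on the call
        "Subject": subject,
        "TaskSubtype": "Call",
        "CallType": direction,             # Inbound / Outbound / Internal
        "CallDurationInSeconds": duration_seconds,
        "Status": "Completed",
    }

payload = build_call_log("003XXXXXXXXXXXX", "Follow-up call", 540)
# An integration would then create the record via the Salesforce REST API
# (POST to the sobjects/Task endpoint; API version varies by org).
print(payload["CallDurationInSeconds"])  # 540
```

Once calls land in the CRM as activity records like this, the reporting and follow-up prompts Petts describes become ordinary Salesforce reports and workflows.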
As ShoreTel has now given their 600 North American channel partners the nod to start offering ShoreTel’s cloud-based offerings, this integration into Salesforce should be very attractive to both on-premise SF users and those of the marketing cloud. Many companies may find that they need additional input from Salesforce experts on how to maximize the union of these two powerful systems.
Experts like this can be an asset in planning, deployment and training of both marketing admins and sales end-users. There are also unique needs, on a company-by-company basis, that can only be addressed by a development team building applications to meet a specific organizational, business or customer requirement.
Nonetheless, the union of ShoreTel and Salesforce will make two already powerful platforms even more important to your sales success if you are one of the many users that now have this critical integration available to them.
Author’s Note: This is an article written for a client in the IT and Application Development industry.
Announced last year, the deal between rivals Microsoft and Oracle that allows Oracle apps to be certified to run on Windows Server, Hyper-V and Windows Azure is being cemented by the software becoming available on Windows Azure as of March 12, 2014. Until now Microsoft hasn’t been charging for Oracle software running on Azure, as it was not final code. As of March, however, Microsoft will start to charge for the “license-included” Oracle software.
The pricing page states, “Beginning March 12, we will charge for the Oracle software running in license-included Oracle VMs in addition to our charge for the Windows Server VMs in which the Oracle software runs. Prices are listed as hourly rates, and we will bill for the Oracle software based on the total number of minutes that your license-included Oracle VMs run during a billing cycle. Additional VM charges will be incurred based on the size and type of underlying VM you are running. After the preview period ends, any license-included Oracle VMs that remain deployed will automatically be billed at the new rates.”
The pricing, shown in the figure below, is for the Oracle software only and does not include the cost of the Virtual Machine it runs on as that is billed separately.
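Since billing is per minute against hourly list rates, with the VM billed separately, estimating a monthly charge is simple arithmetic. A minimal sketch (the hourly rates below are invented placeholders, not Microsoft’s published prices):

```python
# Estimate the monthly charge for a license-included Oracle VM on Azure.
# Billing is per minute at an hourly list rate, and the underlying
# Windows Server VM is billed separately at its own rate.
# All rates here are illustrative placeholders only.

def monthly_charge(oracle_hourly_rate, vm_hourly_rate, minutes_run):
    """Total cost over one billing cycle: both hourly rates, prorated per minute."""
    per_minute = (oracle_hourly_rate + vm_hourly_rate) / 60.0
    return round(per_minute * minutes_run, 2)

# Example: a VM that ran 200 hours (12,000 minutes) during the cycle,
# at hypothetical rates of $1.10/hr (Oracle) and $0.45/hr (VM).
cost = monthly_charge(1.10, 0.45, 12_000)
print(cost)  # 310.0
```

The practical point is the per-minute proration: a VM that is deallocated when idle stops accruing the Oracle software charge along with the VM charge.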
Now this may seem like a case of “strange bedfellows”, but it follows a logical trend in the further development of “the cloud”. Microsoft is simply acknowledging that the range of systems and software required by an exploding number of technology-centric companies means it needs to reach a much broader audience. Microsoft used to focus only on enterprise-employed professional coders, but the community of developers has grown, and the range of languages, frameworks and tools has grown as well.
This is why Tim O’Brien, Microsoft’s General Manager for Platform Strategy, suggests that the best approach for the software giant is to embrace and invest in non-Microsoft technologies and ecosystems.
So the inclusion of Oracle apps is just another of a long list of tools and platforms Microsoft intends to provide support for in their server and cloud products. For some companies, even those with on-premise or local data center deployments, this gives them even more options for utilizing these powerful software applications affordably and with a bit more flexibility in hosting and developing for Oracle apps.
Making moves like this to different environments takes a great deal of expertise though in both the software applications and the hosting environments they live in. There are a number of firms who have deep expertise in both Oracle and the dozens of server and cloud options for hosting. Consulting one of these firms could save you a headache or two and some precious budget dollars.
Another “something different” musical post. The last one got such a good reception, probably because it wasn’t boring tech stuff, that I thought I’d do it more often.
From my SpinDizzy & the Fusion Clowns CD “Do Electric Clowns Dream”.
Featuring the talents of:
Bass, Original Concept – Mark (SpinDizzy) Bowen
Saxes, Arrangement – Marty (Sven-Martin) Keil
Piano – Rich (Mr. Jazz) Cassenti
Drums – Ludovic (Big Foot) Le Squer
Guitar – Greg (Romby the Sidekick Clown) Rombal
According to the “Magic Quadrant Report” for 2014, the market for business intelligence (BI) and analytics platforms is predicted to grow at a rate of 7% through 2017. Some analysts have predicted even faster growth in cloud-based BI solutions. According to analyst Tom Pringle, the current market is worth $85 billion, and by 2017 the cloud BI market alone will be worth at least $4 billion.
Part of the explosion is driven by companies now faced with a need to rapidly access data in a fashion that allows more direct use by decision makers. Many are integrating systems to run in Excel to increase end-user adoption (Microsoft’s cloud offering does this via Office 365) and some are looking at providing access to data via mobile devices.
In order to make this work the ability to customize the data and how it is presented is becoming increasingly important. One industry expert describes their end-users as “drowning in dashboards”, which have become the common way data is aggregated and displayed. Sifting through these to find only the ones critical to you and the decisions you need to make isn’t quite that easy. This tends to drive down adoption and the effectiveness of your BI system.
Gartner sees many of the industry leaders overcoming these issues by way of acquisition, and many have made recent buys of assets that allow for greater levels of functionality and more refined methods for sorting based on user needs, not a one-size-fits-all approach.
In his commentary on the future of the cloud BI market Pringle calls the concept of driving business intelligence use and tailoring its access features “business-led BI”.
He writes, “They are, in a nutshell, BI that is easier to use, cheaper and delivers results faster. The continuous cycle of technology development has played its part in making business-led BI a reality. It is not, however, just the tools themselves that are changing, but the way they are accessed. For the first time this year, my [Pringle’s] market research included cloud BI as part of the estimates of spending in this area.”
Although all see growth in cloud-based systems there is still a large and growing investment in on-premise systems thanks primarily to concerns over data security. Nonetheless even these systems are either being adapted or are adaptable, by capable developers, to meet even the mobile, user customized and more “real-time” needs of modern decision makers.
During the Gartner Business Intelligence and Information Management Summit in Sydney, Dr. Rado Kotorov, vice president of product marketing for Information Builders, had this to say about the traditional BI user and the emerging group: “We always interpreted BI and analytics as a job everyone had to do, which isn’t wrong. But the difference is analysts go through the analytics and data from an analysis perspective and often take their time, while professionals make their decisions based on tacit professional knowledge.”
He also used a new mobile tool as an example of how he sees the changing needs of end-users: “Google Glass has been built to target professionals where they cannot look back at their information, instead need that information right in front of them so they can make decisions almost instantaneously.”
Companies large and small that have already invested heavily in their BI deployments can turn to a number of firms whose planning and development experts can assist in building these newer capabilities into a current system.
No matter where your business fits into this new world of BI, more companies need to take advantage of the data they have accumulated, not fewer. Long term, thanks to a growing supply of lower-cost development, adding business intelligence to your mix won’t necessarily overburden your budget, but it will boost your bottom line.
It’s Friday and not a day to dwell on technical subjects for a change. So instead today’s post is musical. The result of my participation in an online community of recording musicians, Kompoz.com, is this little ditty reminiscent of the funk bands back when I was young and foolish. I’m on bass in this one (along with another fine bassist Alex also in the mix), Marty on horns, Ken on guitar, and Kev on drums. Please enjoy a Friday Funk Tune entitled “Milk Carton”.
Red Hat’s big news on February 11th was the launch of JBoss Data Virtualization 6, making the feature rich platform for data integration and unified access across disparate data sources available to end-users who wish to turn so called “big data” into actionable information.
According to Red Hat’s release, “Data is spread throughout organizations in various big data and traditional stores such as Apache Hadoop, relational databases and NoSQL stores such as MongoDB. Integrating and transforming data from these disparate stores can be a challenge to access or productively use. The difficulty of making this data accessible to external applications, such as analytics and business intelligence software, can be a barrier to effectively leveraging these technologies to extract valuable and actionable information from the data. JBoss Data Virtualization helps solve this problem by allowing for simultaneous access to these disparate stores.”
Although it is more of a positioning idea than an actual change in data warehouse systems, Red Hat wants you to think of data not as an inventory item in a warehouse but as part of a supply chain that feeds a larger set of decision-makers and makes better use of structured and unstructured data.
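Conceptually, data virtualization means one query can span disparate stores without first copying everything into a warehouse. A toy in-memory sketch of that idea, joining “relational” rows with “NoSQL-style” documents (illustrative only; JBoss Data Virtualization exposes this through virtual views queried over standard SQL/JDBC):

```python
# Toy illustration of federated access: one lookup that spans a
# relational-style table and a document store, without copying either.
# This mimics what a virtual view does; it is not JBoss DV code.

orders = [  # rows as they might come from a relational database
    {"order_id": 1, "customer_id": "c1", "total": 250.0},
    {"order_id": 2, "customer_id": "c2", "total": 99.0},
]
profiles = {  # documents as they might come from a MongoDB collection
    "c1": {"name": "Acme Corp", "segment": "enterprise"},
    "c2": {"name": "Widget LLC", "segment": "smb"},
}

def unified_view():
    """Join both stores on the fly, the way a virtual view resolves a query."""
    for row in orders:
        doc = profiles.get(row["customer_id"], {})
        yield {**row, "name": doc.get("name"), "segment": doc.get("segment")}

for rec in unified_view():
    print(rec["name"], rec["total"])
```

The consuming BI or analytics tool only ever sees the unified records; where each field physically lives stays an implementation detail of the virtualization layer.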
According to Syed Rasheed, senior principal product marketing manager at Red Hat JBoss Middleware, “Organizations are no longer suffering from a lack of data, they’re suffering from a lack of the right data. In today’s data-driven world, it is not only analytical applications that need to access data from diverse sources, but operational and transactional applications and processes as well. Business leaders need the right data in order to effectively define the strategic direction of the enterprise.”
He adds that “the reality is that data in most organizations is distributed across multiple operational and analytical systems, including Apache Hadoop, relational databases, and NoSQL stores such as MongoDB. With social media, cloud applications and syndicated data services leading to expanding volume, variety and velocity of data, many organizations are realizing that physical consolidation or replication of data is not practical for all data integration and business agility needs.”
Because Red Hat is essentially creating a “mostly” open-source platform aimed at maximizing the usefulness of the large data sets that companies continue to collect, this addition opens the technology up to business buyers, not just IT and developers. More costly systems make using this data prohibitive for many smaller organizations; Red Hat’s offering allows the data to be accessible to a far larger set of users than many companies could otherwise afford.
Its importance is underscored by Red Hat, “Information management and delivery are critical functions of modern business operations,” said Craig Muzilla, senior vice president, Applications Platform Products Group at Red Hat. “If organizations are going to have any success translating meaningless data into actionable information, integration tools like JBoss Data Virtualization play a key role in making the data readily accessible.”
It is important for firms to understand that “big data” and “business intelligence” now walk nearly hand in hand. Nearly all companies wishing to compete in the new digital universe, and to win with customers who are more mobile, have dramatically changed buying habits and now speak back to marketers via social media, will sooner rather than later need to tackle how to put these still-growing data warehouses to work on behalf of the business.
Companies that want to harness this data and are considering open source tools to do so, or that have multiple data stores like Apache Hadoop, relational databases or NoSQL stores like MongoDB, may wish to engage outside expertise to plan and deploy a Red Hat solution to “big data” mining. Although the platform is relatively new, its basic components are not, and there are a number of low-cost resources for building your data virtualization system.
This new addition to Red Hat’s list of integrations should make that easier and more accessible to firms large and small.
Two new additions are joining the companies integrating their services with the Salesforce cloud in order to support customer service as well as sales. Announced on February 18th this year, both ShoreTel and SearchBlox bring powerful new tools for customer-facing support folks.
SearchBlox is an enterprise search solution that allows users to search not only Salesforce but multiple other data sources without ever leaving the SF platform. For users in the past, searching beyond Salesforce meant going back and forth between multiple applications and systems to find answers to customer inquiries. Now a single search from within SF will seek out the answer within multiple data and content stores.
“A single search term can now be used to recall information from Salesforce itself and external sources such as websites, file systems, data/news feeds, social websites and custom applications. Search results are returned to Salesforce users on a single screen, either directly through Salesforce itself or through the SearchBlox™ web interface,” says Timo Selvaraj, Co-Founder of SearchBlox Software, Inc.
“This prevents Salesforce users from having to juggle multiple data and content repositories, allowing them to assist customers faster than ever before and ultimately expand their bottom line. SearchBlox™ literally brings all of their data sources into one place, by allowing federated search from a single screen.”
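The “single screen” result Selvaraj describes is classic federated search: one query fanned out to several repositories, with the hits merged into a single ranked list. A minimal sketch of that pattern (generic illustration; SearchBlox’s actual API and connectors differ):

```python
# Minimal federated-search pattern: run one query against several
# sources and merge the hits into a single relevance-ordered list.
# The two source functions below are stand-ins for real connectors.

def search_crm(query):
    # stand-in for a Salesforce connector
    return [{"source": "Salesforce", "title": "Case 1042", "score": 0.9}]

def search_wiki(query):
    # stand-in for an intranet/website connector
    return [{"source": "Intranet wiki", "title": "Returns policy", "score": 0.7}]

def federated_search(query, sources):
    """Fan the query out to every source, then merge and rank all hits."""
    hits = []
    for source in sources:
        hits.extend(source(query))
    return sorted(hits, key=lambda h: h["score"], reverse=True)

results = federated_search("refund", [search_wiki, search_crm])
print([h["source"] for h in results])  # ['Salesforce', 'Intranet wiki']
```

The value to a support agent is exactly this merge step: one query, one ranked list, no juggling of repositories.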
And to facilitate the call and everything else under the category of “unified communications” ShoreTel’s integration with Salesforce, announced the same day as SearchBlox, is touted as giving SF “end-to-end communications platform integration”. According to ShoreTel’s press release, “By integrating two mission-critical platforms – the communications platform and a company’s critical enterprise application – businesses now have comprehensive reporting on the effectiveness of all communications and customer interactions. ShoreTel’s single source view provides insights that enable higher productivity and more effective collaboration of sales, support and marketing organizations for greater customer satisfaction and better business results.”
Two applications are involved, ShoreTel for Salesforce and ShoreTel Sky for Salesforce, which provide “powerful integration between Salesforce and the ShoreTel on-premises IP-PBX system, the ShoreTel Sky cloud-based phone system, ShoreTel Workgroups, and ShoreTel Sky Contact Center.”
This in theory allows users to manage all their business communications within a single application and facilitates “collaboration in real-time.”
“Finally, phone data is aligned with business data in the CRM system. By integrating the ShoreTel communications platform with Salesforce, one of the most valuable business applications, customers gain new insights, more effective collaboration, higher productivity and better business results,” said David Petts, senior vice president of worldwide sales at ShoreTel. “Automatic logging of all sales activity, regardless of a sales person’s location or device, together with prompts for agents to schedule follow-up actions can lead to closing deals faster and higher overall sales team productivity. For example, a road warrior using a mobile phone with ShoreTel Mobility software will still have the activity logged.”
As it is with all things Salesforce, these new assets fit into a platform that is fairly complex to implement and manage. Even small to mid-sized firms can benefit from these and other Salesforce capabilities, but to avoid costly mistakes they would be well advised to seek out Salesforce experts, particularly in the planning stage.
On occasion some companies will find a need to modify either the Salesforce applications themselves or the applications and data stores these apps connect with. A company with a strong track record in deploying, managing and developing for Salesforce could save you time and money.
Salesforce remains the number one CRM platform in the world and the additions of these new features will make it even harder for many of their competitors to keep up.
Zeus malware and its many variants have been around for quite some time. One of its many children was used in the recent news-making attacks on retail giants Target and Michaels. And although it was primarily designed to steal banking credentials, new variants were recently caught poking around Salesforce.com.
According to Network World, a Zeus variant is targeting individual Windows-based computers in order to break into SF as the victim logs in, “then quickly gathered up a large amount of Salesforce business data through a kind of web-crawling action.”
“It grabbed 2 gigabytes of data in less than 10 minutes,” explains Vice President of Marketing Tal Klein of Adallom, the security company that caught the invasion while monitoring one of its customers. He also noted that it’s the first time the company has seen a variant of Zeus being put to this kind of use.
In the Target case, and likely Michaels and several other high-profile data thefts, the malware didn’t get in by way of Target itself but by going after a Target vendor, first infecting that vendor’s network. Once there it found a direct connection to Target via the channel through which the vendor’s payments were facilitated on Target’s network.
As with any other malware, the victim has to be tricked into opening an email attachment that gets past any anti-malware barriers the victim’s IT folks have put in place. Lately this has been done by attaching a ZIP file containing the malware but using a “.enc” file extension. Because it is not an “.exe” or executable file extension, this seems to fool the security products that are in place to catch malware.
There are a number of email content variations used as bait. If the end-user has too much faith in the anti-virus/malware products in place they may go ahead and open the ZIP file. Once this happens the show is over and Zeus, or “GameOver Zeus” as its most current major variation is called, is off to find a path to bigger “targets”.
IT folks should double-check their logs to see whether, and how many, .enc files have been downloaded into their networks, and regularly review information like that offered by Dell SecureWorks.
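That log check can be scripted. A rough sketch, assuming a web-proxy access log where the requested URL appears as a whitespace-separated field (the log format and path are assumptions; adjust the parsing to your own environment):

```python
# Scan proxy/web access log lines for downloads of ".enc" files,
# the disguised ZIP extension this Zeus variant has been using.
# The sample log format below is an assumption; adapt to your setup.
import re

ENC_URL = re.compile(r"\S+\.enc\b", re.IGNORECASE)

def find_enc_downloads(log_lines):
    """Return the log lines that reference a .enc URL."""
    return [line for line in log_lines if ENC_URL.search(line)]

sample = [
    "10.0.0.5 GET http://example.com/invoice.enc 200",
    "10.0.0.7 GET http://example.com/report.pdf 200",
]
hits = find_enc_downloads(sample)
print(len(hits))  # 1
```

Any hits are worth tracing back to the requesting workstation, since the download itself may predate any visible infection.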
The evidence that Zeus was being used to poke around Salesforce.com and innocent SF users’ data should be a warning that no one is safe. Why it targeted SF data has not been discovered yet. But if it can invade some of the biggest and most secure systems on the planet, then even small users should take steps to protect their businesses from the nasty little critter.
Take advantage of the expertise of external companies with experience in Salesforce and other platform products, as well as in methods to better secure these systems from invasion. Whether your company is the target, or simply the way the malware gets to its ultimate target, it will cost you if you get infected by Zeus.
There is a trend toward bringing some IT jobs that were once outsourced back in-house, and the growth in the number of H-1B visa-bearing engineers and developers is slowing down. Part of this is because the kind of innovative talent many companies need to move their tech forward seems to be here, not in countries whose education systems have focused on more mundane pursuits.
For many companies it is a matter of using your best IT and development personnel to do the most important work, and leaving more day-to-day maintenance and operational support to lower cost outsourced personnel.
Hybrid-Sourcing™ requires a certain amount of balance: using your higher-paid internal expertise to do simple day-to-day management, low-level support and other “busy work” is not a strategy that will give you the best return on your investment in them.
Most analysts described the trend throughout 2013 as the “reshoring” of formerly offshore-outsourced IT services, and GM was held up as the first to do this last year. But as the year progressed it did not result in a deluge of firms returning outsourced work to internal personnel.
More companies have chosen to balance their IT resources by using a hybrid model that combines insourcing with outsourcing. If you look at the economics of your IT operation, as these companies did, you may find that what is needed is a balance between two or more of these approaches in order to get the maximum return on your IT investment.
Offshoring, reshoring, onshoring, nearshoring, insourcing, outsourcing: with so many different choices, Hybrid-Sourcing helps you evaluate each option and choose the combination that gives you the best results.
Hybrid-Sourcing allows you to use your internal resources for projects and tasks that maximize their experience, skill and knowledge of your company, giving you more impact from these generally higher paid personnel. Hybrid-Sourcing is directly tied to business goals and is specific to each company, meaning there is no cookie-cutter approach to determining what combination of sourcing models will work best. Insourcing might work for some tasks and outsourcing for others.
This type of hybrid model often works with offshore outsourced personnel taking on responsibilities that require competence but not necessarily higher level business or technology innovation. Your internal IT investment can then be focused on innovation and the critical revenue-generating work now common in many IT departments.
Keeping your internal experts focused on the critical aspects of your mission will maximize their contribution to your business success. And because most outsourcing services feature far lower hourly rates for a more than acceptable level of competence, your less mission-critical support tasks still get done, but at a lower cost. This too helps increase the return on your entire IT investment.
The kitchen is heating up for IT and it gets hotter every day. Considering how to get your resources in balance, rather than just bringing it all home, may be in your best interest. And with so many companies following this model, you’ll be in good company.
Author’s Note: This article was written for a client last week. According to GeekWire’s coverage of the speech, it had his focus nailed, with the exception of comments relating to the recent issues surrounding Ukraine and Russia.
SharePoint, arguably the longest running and most common intranet collaboration platform, is well known for its many conferences throughout the world, but this year’s SharePoint Conference in Vegas will feature a former U.S. President.
About a month ago Microsoft announced that former president Bill Clinton would deliver the kickoff address at the conference, which focuses on SharePoint, Yammer and Office 365. Clinton has long been a strong supporter of technology advancement and has backed a number of efforts to bring more technology to healthcare in the form of mobile health apps and telemedicine.
It is hard to say whether these are subjects he will cover at the conference, but Microsoft and its huge ecosystem of companies that use these platforms include the kinds of efforts that fit his view of technology helping folks lead better lives.
The invitation is perhaps an outgrowth of a Wired interview last November that featured both President Clinton and Microsoft founder Bill Gates. Clinton, like Gates, spoke at length about examples where connectivity and technology were utilized to aid people in developing nations.
Regardless of the content of his keynote, it is further evidence, for Microsoft devotees at least, that the company has a mission beyond just building a dominant technology business. With over 200 sessions covering the entire Microsoft SharePoint ecosystem and three “sub-keynotes” that focus specifically on IT pros, developers and executive-level decision makers, there is something of interest for entire teams.
2014 is expected to be a big year for SharePoint, and Office 365 in particular, with feature and development shifts that are likely to drive increased adoption of the platform by small enterprise and mid-sized businesses. Still, it is not a cheap project, and many internal IT teams simply do not have the time or experience to further the adoption of these tools, even if the budget dollars are there to support it.
So for these folks, who are likely not attending the conference, all is not lost. Many companies have long track records in analyzing, planning and either migrating or deploying sophisticated platforms like SharePoint and 365. The importance of fully understanding how these tools will be used, what will best drive adoption, what might need to be developed in order to integrate with legacy systems, and what, if anything, is required from a mobile device standpoint can’t be overstated. These firms will save you precious time and money.
Attending the conference won’t necessarily answer all your questions, but if you can make this one you’ll likely hear an interesting technology viewpoint from this former president. If you can’t, at least check out the Wired interview from last year; it is well worth hearing Gates’ and Clinton’s thoughts and experiences with technology that builds solutions to real-world problems.
Microsoft is never shy about its goals, and the launch of Power BI for Office 365 is no different. Microsoft’s goal: bring business intelligence tools to a billion screens.
Business intelligence was once the domain of analysts and data scientists, but increasingly companies are looking to have critical business data presented directly to decision-making end-users in a visual format that doesn’t require a degree in rocket science to understand and act upon.
To Microsoft that means using familiar tools with new features that can more directly access and use business intelligence data. Their solutions: Excel and Office 365. Virginia Backaitis, writing in CMS Wire, claims to be “bullish” on the launch, which made the service available on February 10th. Her primary rationale isn’t so much about how well the new service actually works but more about how deeply embedded knowledge of Office tools is in the general population of business users.
“Everyone from high school students to stock brokers, to C-level execs, to researchers and even the heads of gardening clubs know how to use Microsoft Office tools. Is there a better way to achieve widespread adoption?” she writes in her coverage of the launch.
Nancy Gohring, writing in CiteWorld, likes it too but cautions that it is not an “off-the-shelf” product that you can distribute to your team and start using overnight.
She notes that “the stumbling block will be implementation and ease of use. This is not one packaged product that a group or business can buy and start using. To get the full product, businesses need Office 365 and SharePoint, as well as an administrator to manage it.”
With that in mind, companies that have already invested heavily in 365 and SharePoint will find the transition less stressful, and the time it takes to start benefiting from the service may be shorter. Firms that have not, but plan to invest in these systems and services, might consider seeking external expertise to deploy SharePoint, integrate it with 365 and then add Power BI. These same firms can often help get users up to speed with the new tools, so the business can start to benefit at a more rapid pace.
The tools are fairly slick, though with a few bugs still to work out and mobile use not quite where it should be. Nonetheless the ability for more users to access, visualize and share data directly with teams is an important step forward for Microsoft’s family of cloud and on-premise systems.
Whether it hits “a billion screens,” however, remains to be seen.
Both Oracle’s President Mark Hurd and high profile CEO Larry Ellison showed up for the company’s first HCM World conference this February, underscoring the importance of this space to Oracle’s future. Hurd even joked about how rare this was for the two to appear together at the same event saying, “How many times have Larry Ellison and Mark Hurd presented at the same event other than Oracle OpenWorld? Zero. And we’re doing that here.”
Ellison’s keynote was focused on Oracle HCM’s “socially enabled” application suite, which Oracle now claims is its fastest growing cloud offering. The HCM Cloud includes recruiting, on-boarding, training, career planning, talent-review, and workforce planning functionalities wrapped up in a system that has distinct “social” roots.
“If you look at the user interfaces, it looks like a social network, and that’s good because most people know how to use social networks,” Ellison said.
He slammed the old 20th century software approach that required weeks of complicated training to perform relatively minor management tasks. And he doesn’t just want it to look like a social network, he wants it to tap into some of the power of social networks as recruiting and referral resources.
He also spoke to the importance of the built-in analytic tools: “The bad news is you get a lot more candidates that you have to sift through using these social techniques,” Ellison said. “That’s why we have these analytics that help you identify the very best candidates that you want to start interviewing right away.”
Ellison also touted the system’s ability to aid in retention, noting how its review process builds in a constant risk analysis that measures right down to the individual employee. Driving it home in the closing of his keynote address, he pulled no punches: “We have the only HCM system that has an integrated social network,” he said. “We’re the only one that has integrated recruiting and social sourcing inside of the core HCM system. We’re the only one with integrated learning management and the only one with predictive analytics throughout the system.”
Is Oracle HCM right for your HR department? That may be a question best answered by experts who have experience not only with Oracle HCM but all of Oracle’s products, both cloud-based and on-premise. As with all systems your specific needs will be different than other users and understanding how you are going to meet those needs sometimes requires special insight.
Still there has been an increasing interest, on the part of HR departments both large and small, to find better tools for the difficult job of Human Capital Management. An abundance of applicants mixed with a shortage of qualified candidates is causing most HR departments to seek ways to sort more quickly, get employees on-board and productive faster and keep the good ones longer.
It’s fairly clear that Ellison and company believe that solution is Oracle HCM.
Oracle CEO Larry Ellison, often outspoken and given to moving aggressively when the spirit moves him, is redefining whom Oracle is actually in competition with. It used to be primarily IBM and SAP, but as the aggregate collection of infrastructure elements and SaaS has emerged to create a whole new universe known as “the cloud,” he sees it differently now.
In a question and answer period following a talk at Oracle’s CloudWorld conference, Ellison stated, “Our competitors are this whole new generation of cloud companies. We’re focused on the infrastructure companies like Amazon and the SaaS companies like Salesforce. We just swapped a bunch of big guys—IBM and SAP—for a bunch of other guys; small but agile.”
There primarily to promote both Oracle’s cloud and on-premise systems, Ellison admitted that Oracle was slower than many in moving further towards the cloud. That is odd, as Ellison could lay claim, and occasionally does, to having conceived the entire cloud concept. He explained the slower movement this way: “If I’m going to do one application and put it in the cloud, that’s fairly straightforward,” Ellison said. “Oracle isn’t a company that builds one application, it’s a company that builds many.”
And build many they have. As he puts it, “You can say that we’re late but it’s not because we started late, we started 10 years ago. It was just an enormous task bringing an enormous suite of apps to the cloud.”
Integration with other clouds also seems to be part of Oracle’s adaptation to bring its apps to layers everywhere. Under an agreement put together in June of 2013, Oracle has released an adapter that allows customers to copy data between their cloud-based Salesforce account and their on-premise Oracle software.
This is part of Oracle’s move to integrate clouds with its on-premise software deployments. According to Demed L’Her, Oracle vice president of product management, “We’re encapsulating standard Web services calls into easier-to-use adapters.” It is to be the first in a number of connectors the company plans to offer that connect cloud services with on-premises Oracle applications.
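To make the idea concrete, here is a minimal, hypothetical Python sketch of what “encapsulating standard Web services calls into easier-to-use adapters” means in practice: the raw Salesforce REST query endpoint (`/services/data/vNN.N/query` is Salesforce’s real URL format) is hidden behind a simpler, task-oriented interface. The class name, method names and sample data below are illustrative assumptions, not Oracle’s actual adapter API.

```python
import json
from urllib.parse import quote

class SalesforceAdapter:
    """Hypothetical adapter: wraps raw web-services plumbing behind simple calls."""

    def __init__(self, instance_url, api_version="29.0"):
        self.instance_url = instance_url
        self.api_version = api_version

    def query_url(self, soql):
        # Build the REST URL a raw web-services call would require,
        # including the URL-encoding the caller would otherwise handle.
        return (f"{self.instance_url}/services/data/v{self.api_version}"
                f"/query?q={quote(soql)}")

    def extract_accounts(self, response_body):
        # Flatten a Salesforce query response into simple dicts
        # ready for copying into the target (on-premise) system.
        payload = json.loads(response_body)
        return [{"name": r["Name"], "id": r["Id"]}
                for r in payload.get("records", [])]

# A canned JSON response stands in for the actual HTTP call.
adapter = SalesforceAdapter("https://na1.salesforce.com")
url = adapter.query_url("SELECT Id, Name FROM Account")
canned = '{"records": [{"Id": "001xx", "Name": "Acme"}]}'
rows = adapter.extract_accounts(canned)
```

An adapter along these lines lets the on-premise side ask for “accounts” without knowing anything about SOQL, URL encoding or the shape of the JSON response, which is exactly the convenience L’Her describes.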
Ellison’s move allows existing on-premise Oracle users to maintain their investment while opening up gateways to cloud-based systems that augment their overall value. Small to mid-size enterprises with on-premise Oracle deployments may wish to consult with experts familiar with both Oracle and the systems they wish to integrate.
Long term, more companies than not will need to find better ways to cultivate their data investments and maximize the impact that data can have on the business. Increasingly, whether they like it or not, infrastructure and XaaS companies will need to entertain “co-opetition” to provide their end-users with the integrated data systems they require.
According to a recent article in Forbes, Ellie Cachette thinks the fear of outsourcing technology development is holding back entrepreneurs. In the article she focuses on female chief executives like herself, but argues that fears about outsourcing technology development are misplaced for all tech innovators.
She points to investors as the first who seem to automatically dismiss the concept of outsourcing technology development. She writes in Forbes that so often “outsourcing technology is seen as a turnoff for investors. When a founder is trying to raise money, especially on the West Coast, the first question is often, ‘Who is your technical co-founder?’”
In fact, she goes so far as to say that in some cases it is the successful start-up’s “dirty little secret.” She writes, “While it’s often only talked about in hushed voices, many successful start-ups leverage outside development agencies, and until your business has high security needs, this might work for mid-term company growth as well as short term.”
Yet CIO magazine, in its article “10 IT Outsourcing Trends to Watch in 2014,” cites as number two the concept of “hybrid offshoring” heating up. In a quote from Atul Vasithsha, chairman of outsourcing consultancy NeoGroup, he indicates that “In 2014, offshoring to a supplier will not be the default.”
Rather it will be a hybrid of insourced and outsourced offshore development services. “Companies are starting to invest more in global business services models, [which combine] the best of shared services and outsourcing under a common governance model. This is seeing processes being offshored in captives by industries that have traditionally been reluctant, such as media and entertainment,” Vasithsha says in the CIO article.
With a number of industries beginning to feel the crunch to catch up technology-wise, 2014 is likely to be full of initiatives in retail, insurance and manufacturing. Increasingly there is a need to do a better job of protecting customer data, and these industries have been “patching up” aging systems for too long.
The recent security breaches at Target and Michaels were just the latest in a long line of breaches dating back to TJ Maxx several years ago. A combination of lax physical security and old firewall technology with far too many holes in it has led to more retailers experiencing breaches by now well-experienced data thieves.
And the thefts and types of retailers affected are not likely to end soon. Smaller and smaller firms are being attacked and many an analyst believes this is just the tip of the iceberg.
The need to catch up and get ahead of these criminals has more than one major retailer seeking new technologies to protect itself. This has already driven firms in both retail and insurance to begin outsourcing at least a portion of their systems development and deployment to offshore firms, all in an effort to speed up the process of protecting themselves and their customers from credit and debit card data theft.
So despite the average investor’s fear of outsourcing technology development, the practical folks faced with real world issues don’t seem to agree. 2014 is going to be a busy year for development and the winning companies will be those who balance internal resources with external services successfully.
Now that Salesforce1 has lifted off and some of the excitement is settling into more practical considerations, Salesforce.com is set to roll out a big upgrade to its industry-leading CRM product. The Spring ’14 edition of the flagship CRM has been in preview, with general release set for any day now.
According to PC World’s Chris Kanaracus, this reflects how most of Salesforce’s customers are still focused very much on the company’s core customer relationship management (CRM) application. Among the new features is a new mobile experience that shifts users to the Salesforce1 mobile application. Kanaracus explains, “The previously released Salesforce Touch is no longer available for download or use as an in-browser application, but Salesforce1 retains features such as support for Salesforce Communities while adding a refreshed user experience.
Chatter Mobile for Android and iOS have also been “upgraded” to Salesforce1. However, the BlackBerry version won’t get the same treatment. It is now a “connected app” that ties into Salesforce.com through an API.”
In addition to bumping up per-user storage to 2GB, a number of other feature enhancements and changes have been made. For a complete look, see the 300+ page release notes.
Dell Services, the IT and business services arm of the longtime server manufacturer, is seeking to help its partners and customers develop for and migrate applications to the Salesforce Platform, and has just rolled out a whole new set of services to support these efforts. This will give developers yet another way to create apps for Salesforce, but the company thinks there is more in this deal for it than that.
Dell says that it will offer “expert advisory and application migration services” while also acting as a single point of contact for design, delivery and ongoing application management.
According to Raman Sapra, executive director and global head of Strategy and Business Innovation Services at Dell Services, in his prepared statement, “This collaboration helps Dell customers develop new applications using the cloud, on the platform that works best for their business. Customers will benefit not only from our strong Salesforce advisory services but also from Dell’s own successful implementation of the Salesforce Platform. Offering customers this type of choice and flexibility is at the heart of Dell’s overarching cloud strategy.”
Although it is a strong sign of the importance of application development and migration for Salesforce, some businesses may find Dell’s services too costly. For these businesses there are still a number of alternatives that already have long track records developing and deploying applications to the popular platform.
Looking to this kind of expertise keeps you from paying the cost of Dell’s learning curve in the space, and in most cases companies with longer track records working with Salesforce can be faster, more successful and less expensive.
Nonetheless this is a significant development for Salesforce, and Dell’s interest in providing services specifically for this platform shows that outside expertise will continue to be a useful asset to IT departments and sales end-users for many years to come.
According to SharePoint expert Christian Buckley, whether you are considering updating your SharePoint platform or implementing one for the first time, your top consideration should be productivity. As he puts it in his piece for CMS Wire, “A common question from executives to the teams and stakeholders who own and manage SharePoint is: how productive are our end users in SharePoint?”
SharePoint is a collaborative platform, perhaps one of the very first and arguably the most mature. Nonetheless its primary function is to facilitate the work of a team and, as with most things, it is only as good as its weakest end-user.
Buckley says the focus on end user productivity serves several goals:
- To simplify the interface into SharePoint
- To better align end user activities with the needs of the business
- To better streamline business processes
- To get more out of SharePoint investments you’ve already made
“The result of changing your focus to end user productivity means a higher return on investment (ROI) for the platform overall, because it means more users on the platform, getting more out of the platform.”
Now, maximizing the productivity of your SharePoint system may mean doing some tweaking, or, as SharePoint expert Scott Robinson hopes, Microsoft might just deliver those tweaks yet this year.
In particular, he would like to see more Yammer in the mix. As you may recall, Yammer is an enterprise social network platform Microsoft acquired recently, and as Robinson points out it “was a leap forward for Microsoft, strategically sensible and timely. But today, what we have is only halfway there; with Yammer on the front end and the SharePoint content management system on the back end, we get an ungainly hybrid search result.”
“One of the huge advantages of SharePoint 2013 over its predecessors is the option to include Exchange servers in enterprise search. We want that when the search is initiated through Yammer, and we want it via direct access of Exchange from Yammer. Deepening that integration would be a win on several levels, encouraging greater use of Yammer, improving the quality of enterprise search and delivering a friendlier mobile experience.”
He adds that it is past time for Microsoft to incorporate “responsive design,” which adjusts the user experience to fit the device being used, into SharePoint. This is a common theme among older and legacy platforms, as companies increasingly want to manage one site, not multiple sites geared to specific devices like tablets or smartphones.
SharePoint, like all legacy software, is both aging and evolving to meet new business and technology challenges. Keeping up with these changes can get costly if you do not stay on top of the trends. For some companies, seeking out firms with solid knowledge of the systems and how best to fit them to your business needs will be a best practice to follow.
One could argue that if there is a place for the much-ballyhooed “Big Data,” it lies at least within the systems purview of Business Intelligence (BI). For years BI was a tedious process involving statistical analysis and weeding through tons of bad data to get to even a small grain of truth. IBM, Microsoft, Oracle, SAP and SAS still lead the pack of companies with systems deployed to maintain and weed this crop of data, and for years these deployments were primarily in larger companies.
However, there is “gold in them thar hills” for all, and increasingly BI is getting more attention, particularly in the boardroom. C-level folks are starting to want to better visualize their analytics and work with the data more directly, rather than via reports and interpretations by staff or third parties.
The really big hype over “big data” has subsided a bit and turned into real interest in how best to harness even the data already accumulated over years of operation. Increasingly, small and mid-sized retailers are striving to catch up to the current state of BI in order to better anticipate their customers’ needs.
One thing seems sure, 2014 is going to be a busy year full of new opportunities to use both old and new BI systems to do more with all forms of data. Information Week’s “2014 Analytics, BI, and Information Management Survey” conducted in November 2013 gave some insights into what many enterprises are considering in the near future.
The report’s summary hits the highlights of the report generated from input by organizations using or planning to deploy data analytics, business intelligence or statistical analysis software. 67% of those surveyed indicated interest in employing advanced data analysis to advance their businesses.
Other points that rose to the top:
- 59% say data quality problems are the biggest barrier to successful analytics or BI initiatives.
- 35% have standardized on one or a few analytics and BI products deployed throughout the company.
- 44% say “predicting customer behavior” is the biggest factor driving interest in big data analysis.
- 47% list “expertise being scarce and expensive” as the primary concern about using big data software.
- 58% list “accessing relevant, timely, or reliable data” as their organization’s biggest impediment to success regarding information management.
The large enterprises that have already deployed BI and analytics systems will be facing challenges to first correct the existing weaknesses and then look to integrate these systems with cloud based applications to allow greater use of and direct interaction with critical management data. For the smaller business many new players and most of the older ones are already moving many features and connections to the “cloud”.
For obvious data security reasons, adoption of cloud-based BI systems hasn’t been the kind of rocket that more marketing-oriented systems have. The larger companies keep this sort of data fairly close to the vest and locked down tight. However, a recent research study conducted by Pringle & Company found that the cloud BI sector is growing fairly rapidly and should be worth close to $4 billion by 2017.
According to the study, the market for BI software and services in 2013 was $85.9bn, growing at a CAGR of 16.4% and on pace to almost double in size by 2017. “BI is one of the, if not the, stellar performer of the IT market, having sustained high levels of growth during the recent challenging economic times,” Pringle said.
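As a quick sanity check of the “almost doubling” claim, compounding the cited 2013 market size at the cited 16.4% CAGR over the four years to 2017 works out as follows (a simple illustrative calculation using the figures as quoted, not part of the Pringle study itself):

```python
# Compound the 2013 BI market size at the cited 16.4% CAGR for four years.
base_2013 = 85.9                           # BI software and services market, $bn (as cited)
cagr = 0.164                               # compound annual growth rate (as cited)
market_2017 = base_2013 * (1 + cagr) ** 4  # projected 2017 market size, ~$157.7bn
growth_factor = market_2017 / base_2013    # ~1.84x, i.e. "almost doubling"
```

So four years at 16.4% compounds to roughly 1.84 times the starting size, which squares with the study’s “almost doubling” language.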
“BI has traditionally been focused in the hands of a relatively small number of users who, largely, work in enterprise-scale organisations. That has started to change with the availability of business-led BI, an evolution of the tools used to analyse data allowing a revolution in the scale of user community able to benefit from data-driven insight.”
The industries primarily affected, and that should be considering what to do, are retail, insurance, healthcare, and of course most consumer- or business-oriented service sectors. As more of their customers move to a mobile, web-centric universe, the amount of data available to even small businesses will hold the secrets to success. Companies interested in this space should take a hard look at their current data management and then seek an objective outside opinion on what to do next.
For some of these businesses, more traditional BI systems will be necessary to comply with regulations within their industries. Others will benefit from the lower costs and faster deployment that cloud-based systems can provide. All will need to avoid turning a “blind eye” to the increasing need for advanced business analytics; how fast they integrate all of their data systems will be critical to survival in an increasingly tech-driven world.
Still carefully balancing its existing product set, SAP is now pushing a bit harder to build its cloud business. Addressing primarily larger enterprise level organizations, the European software giant is increasingly building partnerships to deploy private cloud based versions of its software.
Enterprise buyers seem to be signalling a strong preference for private cloud deployments over public, particularly for key or sensitive workloads. Some are mixing in public aspects to their cloud only for less “mission-critical” applications like web apps, websites, collaboration, and content management platforms. Still they like clouds for the reduction in total cost of ownership (TCO), infrastructure flexibility and the shorter provisioning time-frames.
With Oracle, IBM and Microsoft already deeply engaged in providing enterprise-level cloud systems, and with cloud revenues growing rapidly, SAP certainly could not ignore the cloud. SAP has, according to an analysis by Larry Dignan in ZDNet, “big ambitions and what are likely to be a few challenges ahead. SAP is projecting €22 billion ($30 billion) in revenue in 2017 with €3.5 billion ($4.7 billion) in cloud revenue in the same time frame. SAP won’t hit its 2015 35 percent profit margin goal and pushed it out until 2017.”
On January 21st SAP announced plans to speed up its shift to providing cloud-based software services and reduce the emphasis on on-premise deployments. It has set its sights on growing the $4 billion it had in cloud business last year to roughly $4.7 billion over the next three years. So far its core licensing and maintenance business has suffered little, thanks in large part to growth driven by its in-memory database HANA, and SAP still expects on-premise growth of 6%-8% in 2014.
SAP also is seeking to reach down towards small to mid-sized enterprise customers (SMEs) and has entered into a partnership with NEC Corporation that allows NEC to provide SMEs, primarily in Southeast Asian markets, with SAP’s Business ByDesign ERP product. This is their way of showing SMEs that SAP’s cloud solutions are not just for big enterprise.
For its part NEC intends to work closely with the software giant to provide unique localization capabilities to SAP’s cloud ERP solution.
“NEC has been providing SAP-based packaged solutions to a wide range of customers that are expanding their business presence globally,” said Yutaka Noguchi, general manager, enterprise solutions development division, NEC Corporation. “However, we also expect strong demand for solutions that are easy to implement for small to midsize businesses when they are seeking to expand into emerging markets. To tap into the growing business opportunities, NEC will work closely with SAP to integrate its cloud business expertise and assets with SAP Business ByDesign.”
The German software giant’s interest in the Asian market, particularly China, is clear. China alone, according to Forrester Research, is expected to spend $124.5 billion on IT in 2014. And with a mobile user base roughly the size of the entire U.S. population, China is perhaps the largest mobile market in the world. All of this underscores the revenue opportunities for SAP. Yet the U.S. user base is still strong, and eventually SMEs here will also begin to consider the value of SAP, or similar systems, in the cloud.
Making the decision on what to do will be hard enough for some SMEs, and larger existing customers will also have much to wade through if they are to move towards the flexibility of the cloud. SMEs in particular will be wise to consider consulting with SAP experts and services designed to sort through business needs and make clear recommendations on best practices for moving from an on-premise model to the cloud.
For existing deployments still happy with the on-premise approach but wanting to gain some of the mobile device or collaborative features, these same firms are capable of developing application-specific solutions that provide those capabilities while maintaining your existing systems.
SAP’s move to the cloud should make this space even more competitive and could, long term, signal some significant growth in the size of the audience for these sophisticated ERP systems.
The adoption of mobile devices in healthcare has already started and early on was driven primarily by the continuing evolution of the “electronic health record.” With HIPAA and a number of other laws providing a framework and timetable for adoption by physicians and hospitals alike, many have been slow to adopt, and time has already run out for some.
Tablets in particular are increasingly being used by physicians and staff, with record systems that allow patient data to be entered directly into the database, finally eliminating the “scribble” long associated with handwritten notes. This also gives medical staff quick access to other networked resources that can aid in patient care.
Now, with the “final guidance” issued by the FDA last September, the race is on to develop “mHealth apps” for patient use on mobile devices. Many gray areas still exist, however, and the guidance covered only a very narrow spectrum of what the FDA considers apps that “pose potential patient dangers”.
Now the Consumer Electronics Association is taking that guidance on the road to clarify the requirements for app developers and help them to identify best practices being used by those who are already building regulated applications. Entitled the “Mobile Medical Apps (MMA) Roadshow: Managing App Development under FDA Regulation” the educational series is being held at multiple U.S. universities.
“There are huge opportunities for healthcare providers to monitor and diagnose remotely, and for individuals to take a major role in their own healthcare with fewer trips to the doctor,” said Gary Shapiro, president and CEO, CEA. “Consumers are increasingly using electronics to manage their health, and we are thrilled to support a program that aims to spur innovation by bringing more mobile apps into the healthcare space.”
Unfortunately, some are concerned that the regulations are already so complex that they could squeeze smaller competitors out of the space, resulting in fewer useful applications and less overall innovation. Part of the problem is getting through the FDA review, and with new legislation being introduced in Congress that would provide further guidance to the FDA concerning mobile health applications, things may get a bit more confusing further down the road.
Considering that this is a trillion-dollar industry and that half of us already own smartphones and other “smart” devices, there is much at stake. Applications to provide better patient outcomes and proactive medical activities are all on the horizon. Yet using the Internet to connect to and control implants, facilitate the transmission of medical data from patient to physician, or provide self-help devices to monitor everything from weight to blood pressure will require uptime rules and privacy protections.
If your organization is considering developing mHealth applications, partnering with experts in both development and the rapidly changing rules and regulations can mean the difference between success and failure. Although the process is complicated, even smaller start-ups and mid-sized existing medical businesses could bring both helpful and lucrative applications to market.
To be sure, the MMA roadshow is the first place to start, and there are still several sessions scheduled through the end of 2014. After that, it is likely that mHealth application development will be well on its way to adding new devices to keep us fit.
Proximity marketing as a concept is not new. It is often described first in terms of the type of device it can address in the hands of a potential customer. Essentially the concept has always been sort of a “broadcasting system” that addresses mobile devices.
Wikipedia describes it in terms of the “location” of a device: a phone in a particular cell, a Bluetooth or WiFi device, an Internet device with GPS, or a near field communication (NFC) device that reads radio-frequency identification (RFID) tags placed on shelves. These are used to launch location-specific content from either local or Internet servers.
The RFID method has limitations beyond scanning the tag itself, as it has a range of only a few inches. However, a newer version of Bluetooth, Bluetooth Low Energy (BLE), can send signals up to 150 feet. A subtle addition to Apple’s iPhone feature set called iBeacon is an “indoor positioning system” that uses BLE and can be used by proximity marketers to push information out to iPhones. Android devices have similar capabilities, as they also support BLE, and on both platforms the additional drain on the customer’s battery is far lower with BLE than with NFC.
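To make the beacon idea concrete, the sketch below parses the publicly documented iBeacon advertisement frame (company ID, beacon type, a 16-byte proximity UUID, major/minor identifiers, and a calibrated transmit power) and derives a rough distance estimate. The UUID, major/minor values, calibrated power, and path-loss exponent here are illustrative assumptions, not any vendor’s real values.

```python
import struct
import uuid

def parse_ibeacon(mfg_data: bytes):
    """Parse Apple iBeacon manufacturer data from a BLE advertisement.

    Frame layout: company ID 0x004C (little-endian), beacon type 0x02,
    payload length 0x15, 16-byte proximity UUID, 2-byte major and minor
    (big-endian), and a 1-byte signed TX power (expected RSSI at 1 m).
    """
    if len(mfg_data) != 25 or mfg_data[:4] != b"\x4c\x00\x02\x15":
        return None  # not an iBeacon frame
    proximity_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor = struct.unpack(">HH", mfg_data[20:24])
    tx_power = struct.unpack("b", mfg_data[24:25])[0]
    return {"uuid": str(proximity_uuid), "major": major,
            "minor": minor, "tx_power": tx_power}

def estimate_distance_m(rssi: int, tx_power: int, n: float = 2.0) -> float:
    """Rough log-distance path-loss estimate; n = 2.0 models free space."""
    return 10 ** ((tx_power - rssi) / (10 * n))

# A hypothetical store beacon: major = store number, minor = aisle.
frame = (b"\x4c\x00\x02\x15"
         + uuid.UUID("f7826da6-4fa2-4e98-8024-bc5b71e0893e").bytes
         + struct.pack(">HH", 1, 42)   # major=1, minor=42 (made up)
         + struct.pack("b", -59))      # calibrated RSSI at 1 m, dBm

beacon = parse_ibeacon(frame)
print(beacon["major"], beacon["minor"])                        # 1 42
print(round(estimate_distance_m(-75, beacon["tx_power"]), 1))  # 6.3
```

The distance figure is only a rough guide; in a real store, walls, shelves, and shoppers make RSSI noisy, which is why marketers usually bucket it into “immediate / near / far” rather than trusting meters.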
Calling its proximity system a “micro-targeted in-store mobile marketing platform”, currently under patent review, Swirl Networks, Inc. is banking on the fact that “brick and mortar” sales still lead, with 90% of retail sales taking place there, and that over 50% of U.S. adults now carry a smartphone.
As a growing majority of folks start almost every purchase decision in front of a search engine, it is no wonder that this will eventually go beyond limited broadcast-range applications to interacting directly with customers even as they search the competition from inside your store. These “in-store” platforms have the potential to provide customer service tools, in-store searches for both help and products, and even ongoing alerts broadcast directly to the phone with special sales offers or other enticements.
Now enter the newest low-power processors, which can be combined with BLE chips in small devices to do very powerful things both locally and in interactions with the Internet. This is where the “Internet of Things” comes in.
The possibility now exists that we will soon see these deployed in retail environments, allowing not only direct interaction with a shopping customer but also a running data set on what they consider, buy, or pass over. Imagine seeing a price change on the shelf just for you, simply because you have the store’s application enabled on your smartphone.
Swirl even offers a mobile client software development kit (SDK) that enables retailers who use its system to build their own mobile applications with Swirl’s capabilities embedded in them. According to the SDK’s release notes, this includes “the ability to engage shoppers with highly targeted and personalized content based on indoor micro-location as determined by the SecureCast beacons. The same in-store mobile experiences can also be added to any third-party publisher’s app to expand audience reach and drive additional foot traffic to retail stores”.
Retailers of all sizes and product types will need to consider if these tools and a more mobile audience may be their way to hold back the advance of Internet based retailers into their customer base. Developing within systems like this does not have to be overly expensive and can be easily within the reach of even smaller retailers. If they are to take advantage of these new technologies and the growth of mobile Internet use, seeking outside expertise will be worth the investment.
In the end, the important thing to note is that mobility has brought folks back out into stores. Now these brick and mortar operations have to use that mobility and the power of the Internet to win back their local audience.
First, the bad news. Microsoft recently changed its licensing model for the Enterprise edition of SQL Server 2012 to a “per-core” basis. It has grandfathered users who deployed under the old per-processor licensing model, but new customers will have to subscribe under the new one.
According to licensing expert Paul DeGroot, there are a few things to look out for in the new model. Pricing has remained mostly the same across the SQL Server 2012 editions. However, he adds that there are some “traps” that could significantly increase costs for organizations, particularly at the license renewal stage.
For instance, the core conversions from SQL Server 2008 to SQL Server 2012 can be incorrectly assessed. An organization might assume four cores per licensed processor, but the actual core counts could turn out to be higher at license renewal. He cited an example of a reseller that under-licensed in that way, resulting in an extra “true-up” expense for a customer of $290,000.
DeGroot also urges customers not to accept Microsoft’s core counts without scrutiny. The important takeaway is that organizations should conduct their own survey of equipment and be sure they are using accurate counts, both when initiating the license and at renewal. The new licensing scheme will continue in the next-generation SQL Server 2014, which is already in trial deployments and should hit full steam later this year.
SQL Server 2014 has already revealed several enhancements that should excite fans of the platform:
Encrypting backups gets simpler in SQL Server 2014. Now you can easily prevent a stolen backup file from being restored on any other system.
SQL Server 2014 includes business intelligence (BI) improvements to help build and support vast databases and data warehouses.
In-Memory OLTP Engine
In-Memory OLTP significantly improves OLTP workload performance and reduces processing time. The gains depend on your hardware infrastructure and how your queries are written.
Memory Optimization Advisor
The SQL Server 2014 Memory Optimization Advisor helps you quickly analyze your tables and walks you through reviewing and migrating disk-based tables to In-Memory OLTP tables.
Windows Azure Integration
SQL Server 2014 includes the ability to use Windows Azure for offsite backup storage and as a disaster recovery site. With SQL Server 2014 you can back up and restore your databases from copies stored in Windows Azure, which can quickly and easily provide off-site storage for all backups.
With many organizations still heavily invested in SQL Server, most will want to stay in step with both the licensing changes and the new features and improvements that SQL Server 2014 brings. Consider the services of an expert to help you puzzle through the changes and the new prowess this version introduces. There are a number of options for gaining this expertise, and organizations exist that can greatly aid you in moving your server infrastructure forward.
In a world now very dependent on technology and marching rapidly into a future where everything will be connected, your database platform may well become one of the most critical investments you maintain.
If you listen to Forbes contributor Gene Marks, Microsoft is about to put Zendesk, Autotask and Freshdesk out of business. As he puts it, “…I’m probably exaggerating when I say that Microsoft is going to put them out of business. But at the very least, they’re about to face a huge competitive challenge.”
Mr. Marks’ company is a Microsoft partner and sells the Dynamics CRM product, so he says, with all deference to the other providers of customer service applications, that “he couldn’t be happier”. The reason he is so happy is that Microsoft just spent $100 million to acquire Parature, yet another highly respected provider of help and service management tools.
The acquisition, and the eventual integration of the Parature products into Dynamics CRM, will further expand its capabilities beyond just marketing and sales. As with industry competitor Salesforce, adding customer service simply increases the value of the entire system, particularly in the shared data that an integrated system spanning marketing, sales and service will produce.
Microsoft also didn’t forget to strengthen Dynamics’ usefulness with its other cloud products and built Dynamics compatibility into its Power BI suite. Although not greeted with as much attention as it deserved, this compatibility allows Power BI to leverage Dynamics data that might otherwise be missed, essentially giving you better ways to visualize vast amounts of opportunity data and adding dramatically to your business intelligence and market planning.
Although still trailing both Salesforce and Oracle CRM products, Dynamics is growing its audience fast and adding more to its ecosystem via its very robust cloud offerings. Still, choosing the right customer relationship management system for your business is no small undertaking. For some businesses Dynamics may be an option, as are both Salesforce and Oracle.
These are critical tools in an age where your data and how you use it can make or break your business. Understanding your options and developing the right systems to meet your business challenges sometimes requires seeking outside advice from firms who understand the different options at a deep level.
These firms can often save you both money and time by bringing to the table years of experience in deploying and customizing all of the options in use. Even if you have an in-house team sometimes it just makes sense to allow them to focus on your business as an outsourced team examines, customizes and deploys your ultimate solution.
With the explosion of information being collected by the systems sales and marketing teams use, Microsoft is simply following the logic of the marketplace and recognizing that its end users benefit greatly from the integration of these systems and their data.
Although originally written as a message board, Drupal is easily one of the first open source content management systems. Like many software development projects, the original aspiration was a commercial boxed product. However, creator Dries Buytaert made it an open source project in 2001, and it was then that it began to find an audience. Two years later it got a boost when Howard Dean, then a U.S. presidential hopeful, and his campaign team employed Drupal to create “DeanSpace”, making it one of the first content platforms to serve as an integral component of a national political campaign.
Although his bid was largely unsuccessful politically, the team that created his website continued to pursue the development of Drupal as a political web platform through the creation of CivicSpace Labs in 2004. Drupal itself continued as an open source community and it has been growing in use and capability ever since.
Though not one of the top two CMSs in use, Drupal still maintains a strong user base and describes itself as both a content management framework and, justifiably, a web application framework. On January 14th, Drupal 7.25, a maintenance release, provided a handful of bug fixes and minor performance enhancements. For a list of the major changes, see Mike Johnston’s CMS Critic review of the release here.
However, that was just a shot across the bow for Drupal developers who, although loyal, have for years complained that the platform is not always developer friendly. With that in mind, and after more than a year of work, Drupal founder Buytaert wants to see it put to bed and the much anticipated Drupal 8 released by the middle of 2014.
In an interview with Computer World Australia Dries stated “We’ve been saying ‘it’s ready when it’s ready’, so what that means for us is when there are no critical bugs left. I track the number of incoming critical bugs versus the number of outgoing critical bugs. Basically how many new critical bugs are reported versus how many we fixed — and the number’s pretty steady, meaning we do a good job fixing them but there’s still some bugs coming in as people download the alphas and try things.
“They try to upgrade a module, for example. And sometimes it’s not just bugs but also when people try to implement against one of the new APIs. Sometimes they’ll say ‘What the hell is this?’ or ‘It could be made easier this way.'”
Drupal 8 will be the culmination of a comprehensive redesign of Drupal’s core and is reported to finally correct Drupal’s long-standing lack of commitment to backward compatibility. That lack has always hampered versioning of the software, limiting its ability to update frequently to better meet developer needs.
Fixing that will likely throw a bit of a wrench into current Drupal developers’ machinery until they adapt to the new platform. “We’ve brought Drupal more in line with modern projects,” Buytaert said. This means work needs to be done to ensure that developers used to previous versions of Drupal have help adjusting to D8. “It’s going to be a really great release, but there are a lot of changes and some people will have to relearn Drupal,” Buytaert said.
“It’s not the Drupal that they used to know, so that’s a challenge. A lot of people absolutely love that and we have a lot of validation around ‘Yep we’re doing the right thing for Drupal’. It’s going to attract many more people to Drupal and we’ve already started to see that.”
DrupalCon 2014 is being held in Austin, Texas, June 2nd through 6th, and tickets are already on sale. Considering that Buytaert is shooting for mid-year to release version 8, it just might be ready by then. As adopting the new platform will require a great deal of development work and a substantial learning curve, many firms may find themselves short-handed if they want to advance to the new version. Fortunately there are plenty of options for supplementing your internal development staff or outsourcing the move to a team of developers already prepared to embrace the new, more advanced Drupal framework.
With over one million Drupal sites already in operation and a growing list of available development help, there are many who believe that Drupal will soon join the current top two content management platforms, WordPress and Joomla, in user adoption. This increase in adoption will be the result of the Drupal team making development easier while still maintaining the core features and security that have made it a favorite with its users for over 13 years.
Holistic: characterized by comprehension of the parts of something as intimately interconnected and explicable only by reference to the whole.
Let’s get something straight from the start: I have both owned and worked with ad agencies throughout my career, and many still have some of the best “idea factories” money can buy. The problem is that the advertising agency model itself, with the “ad” as the basic foundation of marketing, is now only one of a hundred components in a well-structured marketing effort.
For marketing in the here and now a more “holistic” approach is required, one that brings together multiple functions and services, treating them more like individual apps in a single marketing cloud. The difference is that these functions still require human creativity and innovation to drive their development, integration within the greater strategy and deployment to a coordinated set of predetermined engagement channels.
Just as your clients should be wanting to “engage” their customers at a greater depth so should the firms who aid them with that engagement approach their clients as potential partners, not just an opportunity to “pitch” a campaign. There are just too many channels now for the more simplistic brand and advertising strategies of the past to work.
And don’t think for a minute that the traditional channels are completely dead or ever will be. For all the talk of TV ads being dead, infomercials are a bigger business than ever. Considering that video ads still run when folks view content online, and the numbers watching that way are growing daily, the “television” commercial is still with us, just running on a few more channels. Billboards have gone digital, and radio is still showing positive ROI in many specific local applications. Print at street level is indeed weak, but there has been a marked rise in print ad ROI in niche publications that appeal primarily to higher-income customers. The rest of the print outlets have simply stopped killing trees and gone digital themselves.
All of the rest are in the new tech and social channels with mobile devices factoring heavily in how you choose to build your web and to drive more direct mobile based customer engagements (mobile apps for example).
Long- and short-term strategies should still be built on the back of comprehensive data and experience-based knowledge of target audiences. The art of demographics is still in there, but in variations detailed and nuanced enough to facilitate a one-to-one understanding of each customer, not just large, generalized groups. You could think of it as a self-sorting segmentation tool with its own ability to learn and refine knowledge down to the individual customer’s level. This happens over time, so its learning curve has to be accounted for in the long-term strategy and forgiven in short-term measurements.
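A minimal sketch of that self-sorting segmentation idea, assuming the classic recency/frequency/monetary (RFM) heuristic: each customer carries a profile that refines itself with every observed purchase, and the segment label is recomputed from the profile rather than assigned once. The segment names and thresholds below are invented for illustration, not industry standards.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CustomerProfile:
    """Per-customer knowledge that refines itself with each purchase."""
    last_purchase: Optional[date] = None
    purchase_count: int = 0
    total_spend: float = 0.0

    def observe(self, when: date, amount: float) -> None:
        """Fold a new purchase into the profile (the 'learning' step)."""
        self.last_purchase = when
        self.purchase_count += 1
        self.total_spend += amount

    def segment(self, today: date) -> str:
        """Recompute the segment from recency, frequency and spend."""
        if self.last_purchase is None:
            return "prospect"
        if (today - self.last_purchase).days > 180:
            return "lapsed"
        if self.purchase_count >= 5 and self.total_spend >= 500:
            return "loyal high-value"
        return "active"

p = CustomerProfile()
print(p.segment(date(2014, 2, 1)))   # prospect
p.observe(date(2014, 1, 5), 120.0)
print(p.segment(date(2014, 2, 1)))   # active
```

Because the label is derived on demand, the same customer can drift between segments as behavior changes, which is exactly the "forgiven in short-term measurements" property the strategy calls for.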
The data derived from these exercises has to be met with analysis and tactics that can act on refined customer data in near real time. This leads to the need for marketing automation and more direct ties from marketing to sales and sales mechanisms. In other words, it all has to work in concert, not as individual parts with “hand-offs” in between.
Hence the rise of the “Holistic Marketing Practice” (HMP). Part old style creatives, part tech crazy geeks, part stiff collared number crunching data miners, part nurturer, part closer, and all one cohesive effort with as few “generals” as possible. A marketing and sales machine that can be “characterized by comprehension of its parts as intimately interconnected and explicable only by reference to the whole.”
BTU is such a practice and we partner with you in efforts that go well beyond a simple ad agency scope and approach. Big or small. Long term or just a helping hand. We’d love to hear your story and see if we can make a difference. Give us a call when you’re ready.
It is still hard to make a sweeping prediction, or even a sweeping recommendation, about whether it is better to have a natively developed mobile application or a browser-based, cross-platform one. The latter is facilitated by HTML5 and CSS3’s inherent ability to scale to any device. The main reason HTML5 is again growing as the best long-term solution for mobile is that the concept of a separate “mobile web” is going away. Its replacement is being driven by end users’ expectation that whatever you deploy, whether website or application, will run on whatever device they have decided to use.
The browser providers have preferred HTML5 for years, and all of them, including Microsoft’s Internet Explorer, are building in greater and greater support for it. With built-in device interaction and native video and audio without the need for a plug-in player, the mobile browser can support whatever the user needs. These factors, according to ReadWrite’s mobile editor Dan Rowinski, will eventually cause “the mobile web to die in 2014”.
He is fairly blunt about his dislike for the early attempts at mobile web. “In 2014, the mobile Web will die. That’s right, that bastardized version of the normal Web will crawl into a shallow grave and leave us all in peace. No more websites crippled with horrible “mobile.yourawfulwebsite.com” URLs. No more reading janky websites that display way too much fine print or omit crucial features when viewed on your smartphone or tablet.”
The move towards HTML5 is not just a mobile issue either. There are a number of desktop applications and games that are now being built with HTML5 and Mozilla has already introduced its Firefox OS which is a browser-based mobile operating system also built on the principles of HTML5. YouTube rival Vimeo has completely redesigned its video player using HTML5. Brad Dougherty, a member of the Vimeo design team, provided a little background on what has changed to drive them to a pure HTML5 player:
- Browser innovation has brought new HTML5 capabilities (full-screen viewing is now available on every major desktop browser).
- Smartphones have gotten more powerful (and in many cases, bigger), and the variety of smartphones has increased tremendously (three years ago, when we debuted the HTML player, there were only a handful in existence.)
- Firefox added support for H.264 on mobile, Windows, and Linux (with OS X support on the horizon).
- The introduction of devices that support multiple kinds of inputs (e.g., touch, mouse, and pen) at the same time.
“With all these advancements, it was clear that we needed a more flexible and accommodating base for our player,” Dougherty wrote. “So we did the only thing that made sense: We rebuilt the whole thing from scratch.” The bottom line for users is that if their browser supports HTML5, it will automatically be the player used. Thanks to HTML5, video loads faster and gains additional features not easily built into Flash or Silverlight players.
And finally chip giant Intel is pushing its latest HTML5 XDK into the mainstream development community stating the goal to help HTML5 “reach its promise of a true cross-platform, responsive, run-anywhere language and run-time, and which is based on standards”.
The growth of HTML5 as both an application platform and a web design approach has been slow but sure. There will still be room for more native approaches for a while, and the predicted death of the mobile-specific web may not happen as quickly as forecast, but it is something you will need to consider when updating or building your website or next app. Expertise in making that decision is available, and many firms are excellent alternatives to more expensive in-house development or a supplement to your existing development community.
Either way, 2014 may well be the year that HTML5 and its counterpart CSS3 take both the mobile and mainstream web and blend them into experiences that fly on whatever device end users choose.
Small and mid-sized businesses face increasing competition from large enterprises that reach their customers using mobile marketing and proximity-driven geo offers, sometimes taking the business away via smartphone even while the customer is standing in the smaller store.
In a report conducted by Nielsen for Google entitled “Mobile Path to Purchase – Five Key Findings”, Nielsen found that 55% of consumers using mobile to research a purchase wanted to buy it within the hour, and 83% wanted to make the purchase within a day.
The report also noted that mobile research begins with the search engine, not with a branded mobile site or app. On average, the report says, consumers spend over 15 hours per week researching purchases on their smartphone and visit websites via mobile six times during the same time-frame. For 69% of these mobile buyers location proximity was important and most wanted the location to be five miles or less from them.
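That "five miles or less" finding is straightforward to operationalize: given a customer's location, filter stores by great-circle distance. The sketch below uses the standard haversine formula; the store names and coordinates are hypothetical, and real deployments would get the customer's position from the device's geolocation APIs.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8  # mean Earth radius

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def nearby_stores(customer, stores, max_miles=5.0):
    """Return (distance, name) for stores within max_miles, nearest first."""
    hits = [(haversine_miles(customer[0], customer[1], lat, lon), name)
            for name, lat, lon in stores]
    return sorted((d, n) for d, n in hits if d <= max_miles)

# Hypothetical coordinates: two stores, one close to the customer.
stores = [("Downtown", 40.7580, -73.9855), ("Uptown", 40.8610, -73.8900)]
customer = (40.7484, -73.9857)
for dist, name in nearby_stores(customer, stores):
    print(f"{name}: {dist:.1f} mi")   # only Downtown is within 5 miles
```

A mobile landing page that runs this check server-side can lead with the nearest location, hours, and phone number, which is exactly what the report says within-a-day buyers are looking for.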
Perhaps the most encouraging statistic, if you have maintained solid SEO practices and use responsive website development, is that 93% of people who use mobile to research a purchase of a product or service go on to complete a purchase. Even better, most of these purchases happen in a physical store.
For SMBs this means thinking mobile-first may well be the way to boost success in the “brick and mortar” world. Even small storefront businesses will need mobile-friendly websites, along with everything from simple SMS offers to coupon applications used on a smartphone, either scanned or pushed to the customer based on purchasing behavior data while they are in the store. They will need to make sure their websites can be found easily, are simple, and load fast, and because the stats favor purchases within a day and within five miles, physical address, hours of operation and contact info should be right up front.
Although many have thought the web would completely replace small business and that “big box” stores would also spell their doom, mobile may well be the technology that gives SMBs a new edge and a way to better market to the customers they need to grow and prosper. These companies don’t need a huge staff of programmers to meet these new challenges; they have access to the kind of development that produces mobile-friendly websites and better ways to draw local customers in. A number of firms provide both expert advice on options and low costs for developing a mobile marketing solution.
It’s not a matter of if you need to take advantage of mobile marketing in 2014, it’s a matter of how much longer can you wait. Firming up your web and mobile strategies to implement new ways to sell your product or service now is critical. Mobile has already put a dent in the big box and web store sales by providing customers with even more immediate fulfillment of their needs. Don’t let your company fall behind and take full advantage of what mobile applications can bring to your business bottom-line.
Just a few years ago Salesforce was banking on the “social enterprise” as the pitch that would drive its growth. Now even Salesforce admits that the pitch did not resonate with potential customers, and that performance, a more comprehensive platform and the ability to develop better custom applications will be the long-term keys to its mission to dominate the marketing and sales cloud industry.
Its solution was introduced late last year, and according to early data, Salesforce1 is already seeing an application explosion. Salesforce representatives reported a 96 percent increase in Salesforce1 mobile application users and a 46 percent increase in custom mobile app use in just the first month of the new platform’s operation. Not a definitive win, but certainly a strong indication of a positive future.
In his recent analysis of the launch, Larry Dignan, Editor in Chief of ZDNet/SmartPlanet, explained the strategy this way: “Salesforce1 is the company’s effort to future proof enterprises [as] social, cloud computing, mobile and the Internet of things all blend together to create new touch points for customers.” Custom application development is part of the strength of this approach, particularly where the “Internet of Things” is concerned.
In the roll-out of the new platform, Salesforce emphasized that it allows you to build “any type of app”, and it envisions these apps running equally well on any type of device, even those not yet in use. Here’s a short list of the additional app support moves:
- There will be 10x more APIs and services
- ISVs can build, market and sell next gen apps on the AppExchange market
- Current Salesforce apps will run within Salesforce 1
- There’s a Salesforce 1 Admin App for CRM administrators
Salesforce 1 will have an iOS and Android mobile app that will aggregate all Salesforce tools including custom applications run by a company.
Salesforce still needs to expand and enhance the analytics supported within the platform, but as it adds the Internet of Things it may find another way to push further into better analytic tools. The more apps that integrate directly with the Salesforce cloud, the more data becomes available about customers, their purchases and their locations, and the greater the ability to use that data on the fly to generate real-time interaction with those customers via mobile devices.
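One way to picture that "on the fly" interaction is a small rule-driven event handler: customer events stream in, and rules decide whether to push an offer to the customer's device. This is a hand-rolled sketch, not a Salesforce1 API; the event shape, rule thresholds, and offer wording are all assumptions made for illustration.

```python
from typing import Callable, Dict, Tuple

# Hypothetical event shape: (customer_id, event_type, payload).
Event = Tuple[str, str, Dict]

def make_offer_engine(send_push: Callable[[str, str], None]):
    """Return a handler that reacts to customer events in real time.

    Illustrative rules: a large purchase earns a thank-you offer;
    an abandoned cart triggers a reminder with a small discount.
    """
    def handle(event: Event) -> None:
        customer_id, kind, payload = event
        if kind == "purchase" and payload.get("amount", 0) >= 200:
            send_push(customer_id, "Thanks! Here's 10% off your next visit.")
        elif kind == "cart_abandoned":
            send_push(customer_id, "Still thinking it over? Save 5% today.")
        # all other events fall through with no outreach
    return handle

sent = []
engine = make_offer_engine(lambda cid, msg: sent.append((cid, msg)))
engine(("c42", "purchase", {"amount": 250.0}))
engine(("c43", "cart_abandoned", {}))
print(len(sent))  # 2
```

Injecting the `send_push` callback keeps the rules testable in isolation; in production that callback would be whatever mobile notification service the CRM platform exposes.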
The reason for this effort is easy to find as recent research indicates that two thirds of marketers surveyed worldwide plan to increase their investments in digital marketing in 2014. The race to provide the most robust integration of “best of the best” marketing and sales tools into a single service has a very big carrot considering this is already a multi-billion dollar industry.
As more small and mid-sized businesses see competitors embrace these tools, they should consider whether enhancing their marketing and sales operations by adopting Salesforce, or expanding their current use of it, would benefit them. Increasingly they will need to differentiate themselves from the competition by developing their own applications for both customer engagement and data acquisition. Yet for many of these companies custom application development remains too costly, or their internal staff cannot take on the additional workload.
For these companies, an outsourced development solution will be a way to keep up without “breaking the bank”. There are many providers with years of experience with Salesforce, as well as with many other CRM solutions and clouds. These folks are the ones to turn to for solid ideas and solutions based on your business needs, delivered at costs far lower than either smaller local providers or in-house teams.
Although Salesforce is only one of many clouds available, with many more already in the pipeline looking to build a “better mousetrap”, the Salesforce 1 strategy already seems strong enough to increase SF’s hold on a major portion of the market. Whether they can win the race to provide better analytics remains to be seen, as Oracle, Microsoft, and a number of others are likely to build in more comprehensive application support as well.
One thing is for sure: if businesses are going to succeed, most will need solid strategies in place now for utilizing these super CRM tools to meet the challenges of what is expected to be a very active year for mobile application development in 2014.
Although there has been growth in the business intelligence market, BI Scorecard founder Cindi Howson describes 2013 as not the best year; that was the mid-90s, when the BI market was growing at 40% per year. She goes on to say it was not the worst either, for that was 2008, when it limped along at 2% a year.
It has been a varied year, with the giants growing only a little and the “upstarts” doing much better. As she writes in her InformationWeek commentary, Howson notes: “A look at vendor revenues and product releases in 2013 reveals a tale of two worlds: the large BI-platform vendors and the nimbler visual-data-discovery vendors. For the most part, large BI vendors showed flat or low-single-digit revenue growth while companies such as Tableau (75%), TIBCO Spotfire (30%), and QlikTech (23%) have all shown strong, double-digit growth through the first three quarters of 2013.”
Why are the smaller vendors growing so fast? Howson believes it’s “agility”. “In the now-normal frenetic pace of business, users can no longer wait for that perfectly architected, IT-sanctioned reports or carefully modeled business queries. They need to mash together new and broader data sources, whether from the cloud, a partner, a supplier, or Twitter.”
InformationWeek Executive Editor Doug Henschen takes this one step further: the 2014 InformationWeek Analytics, Business Intelligence, and Information Management Survey, he says, clearly favors vendors who have focused on “ease of use”.
He states that, when asking what is getting the most attention, “The biggest gainer in current or planned use compared to last year’s results was Tableau Software while the biggest slides were seen by Actuate, IBM Cognos, and MicroStrategy.” He adds, “As in past years, survey respondents were qualified as being involved in determining the need, recommending, specifying, or authorizing or approving the purchase of data analytics, business intelligence, or statistical analysis software.”
Of the 248 qualified respondents, 13% say they’re using and 6% say they’re planning to use Tableau Software (see chart below), a vendor known for data-visualization software that has a reputation for being easy to use. That’s a 5-point overall increase from our 2013 survey, in which 8% of 417 respondents said they were using and 4% said they were planning to use Tableau.
A great deal of the renewed interest in BI is due in part to the advent of “Big Data”, although the term simply describes the now vast amount of data that continues to be uploaded to the Internet. When companies look at even the data already on hand, they are realizing that the ability to better analyze that data has now become a critical mission. Because there is so much data, it is important that the effort to “mine it” be focused on specific business or business intelligence needs. BI is most successful with solid database design, an understanding of business analysis needs (forecasting, marketing, new product development, etc.), and a very precise short- and long-term implementation plan that ensures your BI results add value to your organization.
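To make the point about focused mining concrete, here is a minimal sketch using Python’s built-in sqlite3; the sales table, its columns, and its values are all invented for illustration. Rather than exporting every raw row into a report, one targeted query answers a single business question:

```python
import sqlite3

# In-memory stand-in for a sales database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "Widget", 120.0), ("East", "Gadget", 80.0),
     ("West", "Widget", 200.0), ("West", "Gadget", 50.0)],
)

# A focused BI question: which region generates the most revenue?
# One targeted aggregate beats hauling every raw row into a report.
top_region = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM sales GROUP BY region ORDER BY revenue DESC LIMIT 1"
).fetchone()

print(top_region)  # ('West', 250.0)
```

The same discipline scales up: a well-designed warehouse queried by a handful of precise, business-driven questions will usually outperform an unfocused dump of “all the data” into a dashboard.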
Many companies find themselves unable to afford or in some cases simply do not have access to experts that adequately cover all of the knowledge to plan, develop, and implement these platforms. To meet these challenges it is wise to turn to companies who provide this level of expertise and aid you in the many tasks you’ll need to perform to implement or upgrade a BI platform. Outsourcing like this allows companies to simply supplement their in-house teams long enough to cost effectively deploy BI systems while not being “tied down” with personnel who may not be needed after deployment.
2014 is going to be a pivotal year in the business intelligence market, and likely for its customers as well. The rebirth of business intelligence as a key company asset means revisiting yours should be on your top ten “to do” list.
If you speak with “old school” networking folks, they will tell you that Larry Ellison’s interest in “the cloud” stems from the fact that he has been involved in building it since he built his first database (working for another company) in the ’70s for the CIA. He has been fighting to make sure that Oracle, which was also the name he gave that first database nearly 40 years ago, leads the data industry…everywhere.
To say that he has Oracle in “a war for the cloud” may well be an understatement. He has fought such wars before: in the early ’90s Oracle came close to bankruptcy and had to win back market share from early competitors like Sybase. He has continued to guide Oracle toward database market dominance ever since, and now has his sights on both Salesforce and Adobe with Oracle’s acquisition of cross-channel marketing firm Responsys for $1.5 billion.
Forrester analyst Rob Brosnan states that “This absolutely is a response to Salesforce. This is the battleground for the next ten years.” He is speaking of the recent acquisitions by Salesforce of Radian6, ExactTarget, Buddy Media, and cloud platforms like Heroku. AllThingsD has pegged Salesforce as potentially growing to be a $10 billion company by 2019.
Responsys joins other billion-dollar acquisitions Eloqua, Taleo, and RightNow, with the company signaling these are just for starters. According to Forbes this means that Oracle is declaring war on Salesforce.com and Adobe: Salesforce.com’s unified platform, the result of SF’s purchase of ExactTarget, and Adobe’s recognition that it needed to get into the marketing space.
Between the two of them, the SF and Adobe platforms are generating $30 billion and are the ones to beat in the marketing space. Only time will tell if the Oracle acquisitions will lead to a real “head to head” battle, but they present both opportunities and further puzzles to solve for end-users of their products.
Smaller and mid-sized companies will find many new options to enhance their ability to compete with larger competitors using Oracle or SF, but they will need increasingly sophisticated knowledge to truly gain value. These companies would benefit from using the outside experts available at the many firms dealing with the development and deployment of marketing systems. Many times these experts can save you budget dollars by ensuring that you will actually benefit from a cloud-based solution, or by determining whether you can develop alternatives based on existing systems you have already invested in.
Either way, the war for the cloud is on and Ellison continues to guide one of the most formidable competitors in the new “cloud technology age”.
Rolled out last July at Microsoft’s Worldwide Partner Conference in Houston, Power BI for Office 365 was met with a great deal of excitement. Released in August, Power BI brings together a number of data modeling and visualization capabilities available via Excel.
All of this was aimed at providing users of Microsoft’s cloud-based services with a more robust “self-service” model to augment on-premise BI implementations when using them “off-premise”.
The initial Power BI release brings a set of key components to Office 365, among them the Power Query data-discovery tool and the natural-language Q&A feature.
In November MS provided a significant upgrade that further strengthened Power BI’s self-service features.
The newest updates to the system were announced in mid-December and further improve several of the self-service features. Power Query can now connect to the Windows Azure Table storage service and adds the ability to use SQL queries to perform relational data imports. Q&A now supports adding Excel workbooks to your Power BI site.
Although the final product has yet to be rolled out, the currently deployed preview version has been on a steady streak of updates since September. Microsoft remains quiet about an expected final release date, but once the improved platform is generally available it will add considerably to the value of the service, both on-premise and off.
As you prepare for the new year and begin to evaluate your business intelligence options, outside expertise may be useful in determining your best options. Many development firms who focus on supporting the vast number of business intelligence platforms are an excellent choice for supplementing your own expertise.
These firms can evaluate your current systems and business needs and provide input on how to get the biggest bang for your budget dollars. The very fact that most of them have been working with cloud BI and on-premise applications gives them a long track record in adapting these to take full advantage of the new features available in this robust platform.
Whether Office 365 is your answer or if a more robust solution is a better fit, using the current set of BI tools now available more effectively may mean the difference between business success or failure. Truly “knowledge is king”, and at some point your business will need to employ sophisticated business intelligence tools to keep up with your competition.
There are a number of predictions about the evolution of mobile applications and what design principles will guide them. One that is getting a lot of attention lately is the addition of voice to individual applications, rather than just as a “virtual assistant” (VA). Up until now most of the attention in app use has been on the swipe, but as the technology evolves more developers are considering giving their applications a voice and letting you control them with yours.
Amazon’s support of Apple’s VoiceOver reading and navigation feature for blind and visually challenged users allows e-books to automatically read the words on the page. Amazon also acquired IVONA Software, providing the Kindle with text-to-speech and other voice-activated features.
Of course more and more companies are building unique versions of the popular iPhone VA Siri, some designing them with a specific business focus or need. But “voice-controlled computing” is gaining steam among business end-users, with a Forrester Research report in July finding that 37% of information workers are now using some form of voice control.
They are not just using the assistant features either, although that is a big part of it for professionals on the go. 56% of those polled indicated they used voice recognition to send texts, 46% used it for search, and 40% to seek directions. Some 38% used their devices for recording notes, much like they would have used a tape recorder in the dark ages. Soon those notes will be easily translated to text via voice recognition and available without having to transcribe them via a keyboard.
Don’t think that PCs are being left out either, as chip maker Intel is working on a new chip for the specific purpose of supporting voice-enabled software applications. It plans to begin putting them in computers slated for release in 2014. For Intel this is aimed as much at increasing market penetration in countries with lower literacy rates but growing technical needs, hopefully driving up PC and hybrid PC/tablet use in those countries.
Intel launched its fourth generation Core processors earlier this year, which according to the chip maker deliver 50 percent improvement in battery life, translating to over nine hours in some systems, and also enable a range of 2-in-1 convertible devices that can act as both a tablet and a PC.
Commenting on the project Intel Technology India’s South Asia Director of Marketing and Market Development stated, “We will continue to move full steam ahead into the tablet and 2-in-1 space. Touch it, type on it, or talk to it – these devices are multitasking powerhouses that will offer us new interaction possibilities”.
Unlike traditional software development, the focus will be on developing thousands of apps along the way, not just single pieces of software that do multiple tasks. Mobile apps tend to be single purpose, and more are being developed with business productivity in mind. Understanding both the context of use and whether voice recognition or control features will provide greater adoption or productivity will help you decide what application development route to take.
Your best bet will be to confer with experts in the field of application development, both for desktop and mobile use. Each platform, desktop or mobile, has its own context and top-notch developers will provide you with insights on how to develop your applications to fit both the mobile and desktop environments. As the reality of building more apps and servicing more devices becomes critical for your organization’s long term success it always pays to gain expert insight.
Nonetheless, in 2014 Siri is going to have a lot of company as an increasing number of apps are given a voice.
The debate over whether mobile applications should be built natively, in HTML5, or as a hybrid continues, and it seems to have ended in a “draw”. In a recent survey of 3,500 developers conducted by Kendo UI (you can get a copy here), respondents, despite having predicted en masse in the same survey just last year that HTML5 would be the winner, now indicate that they are doing what makes sense in each individual circumstance rather than simply going to HTML5.
Last year 94% of the same folks were certain that HTML5 would be taking over and had already begun coding primarily HTML5 apps. According to Todd Anglin, EVP of Telerik, which makes cross-platform development tools, “Developers … are quickly realizing that there is no ‘one-size-fits-all’ solution for their mobile development process. The choice between native and hybrid approaches is dependent on business needs, app requirements, developer skill, development timeline, and other factors.”
Famo.us believes it has the solution for gaining all that HTML5 promised without the performance issues that make it less attractive to developers, particularly those working on devices that won’t run Flash. Currently they have 7,000 developers signed up for the beta of the new framework and a quick visit to their demo site will give developers more than enough reason to check out this new approach.
Gizmox, a provider of HTML5 platforms and development tools, recently made Visual WebGui Version 7 available as part of its effort to aid developers in getting more out of HTML5. Itzik Spitzen, Gizmox CTO, promotes Visual WebGui’s ability to compete with native development saying, “The mobile features available in Visual WebGui 7 along with Visual Studio 2013 integration make an exciting combination for development teams creating and deploying data-driven business applications with native-quality user experiences.”
So how do you decide, if you’re a business that is considering developing one or more mobile apps, which approach to take?
It is best to consult with an expert who can analyse your business and how you or your users can best be served. With the development world changing and evolving each day it is hard for internal, more narrowly focused IT teams to keep up to date, and making the wrong choice could cost you time, money, or even customers. Reaching out to a development company with experience in all the options and a well-thought-out methodology for evaluating your business and business processes is the best way to avoid mistakes.
Either way, apps are expected to surge in numbers as mobile device use also surges worldwide. Businesses are quickly discovering that apps can and do improve their bottom lines if they are designed correctly and “future proofed” by using as flexible an implementation as possible. Careful examination of your audience’s use of devices, how they receive and react to information, and what is in the best interest of your business’s bottom line should guide those design choices.
The first part of a Wired Magazine article on push vs. pull in mobile app design, written by IBM WebSphere Chief Technology Officer Jerry Cuomo and Program Director Robert Vila, compels us to consider carefully their arguments for favoring a push strategy for mobile engagement. In part one the authors focus on why, unlike a traditional desktop web approach that tends to be driven by the user (pull), the nature of mobile makes it more efficient to push essential updates and messaging to the device.
They use a financial transaction application to make their argument and to provide an example of why, in this case anyway, it is better to simply deliver the action rather than wait for users to request it. Although they acknowledge that in many cases “it depends” on the client, desired outcomes, and target users, when you compare the traditional “pull” oriented three-tier web model to one that “pushes” to mobile you can see why pushing makes a lot of sense.
“The familiar three-tier web design pattern, which is driven by user-initiated requests for content, does not always fit the mobile world. The sheer volume of mobile users and increasing ubiquity of mobile devices can lead to exponentially higher numbers of requests that can strain network and server resources to the limit,” write Cuomo and Vila.
However, they look at the mobile push model as overcoming the extra load that the three-tier model creates in the mobile space: “The mobile push pattern, in contrast, sends content to devices automatically, eliminating the dependency on web app servers for pulling content updates. Instead of users continuously pulling or querying information on the odd chance that it has changed, updates are sent proactively using lightweight, integrated messaging services that can be scaled to meet rapidly increasing mobile demand.”
To these authors it is about balancing the load created by less important transactions, like checking your bank balance on your mobile, that users tend to overuse simply because they can. When you push such updates out to the device and allow them to sit in a queue, they are available to the user without placing more strain on the entire system. Think of it like “load balancing” in a data center environment and you will understand how it can be more efficient.
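The queue idea can be illustrated with a toy simulation in Python; this is a hypothetical sketch, not IBM’s implementation, and the numbers are invented. The balance changes twice, yet the polling client generates a server request on every check, while the push client’s reads are served from a local queue:

```python
from collections import deque

class PushQueue:
    """Server pushes updates into a per-device queue; the device reads
    them locally without hitting the application servers again."""
    def __init__(self):
        self.queue = deque()
        self.server_requests = 0  # load on back-end servers

    def push(self, update):
        self.server_requests += 1  # one lightweight message per real change
        self.queue.append(update)

    def read_latest(self):
        # Reading the queued value is local: no server round trip.
        return self.queue[-1] if self.queue else None

class PollingClient:
    """Traditional pull: every check is a full server request."""
    def __init__(self, server_balance):
        self.server_balance = server_balance
        self.server_requests = 0

    def check_balance(self):
        self.server_requests += 1
        return self.server_balance

# The balance changes twice, but the user checks it ten times.
pull = PollingClient(100)
for _ in range(10):
    pull.check_balance()

push = PushQueue()
push.push(100)   # balance set
push.push(250)   # balance changed
for _ in range(10):
    push.read_latest()  # ten local reads, zero extra server load

print(pull.server_requests)  # 10
print(push.server_requests)  # 2
```

The absolute numbers are invented, but the shape of the result matches the authors’ argument: under push, server load tracks actual data changes; under pull, it tracks user curiosity.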
The graphic from IBM below illustrates very well the difference this approach made on one bank’s online workload, simply by replacing user pulls of the account balance with regular pushes of that information to mobile devices.
The concept of pushing for mobile is something that a number of industries should consider when developing applications that could drive up system load, and with it costs. It does depend on your business, yet many in banking, retail, and even healthcare are still struggling with both online and mobile. With more and more customers adopting mobile devices and using them to do what they used to do only on a desktop, most businesses can benefit from developing useful mobile applications to better serve customers.
It doesn’t need to cost an arm and a leg, but it does take careful consideration, and outside expertise is often the best way to be sure you are selecting the right strategy. Mobile development and environments like IBM’s own WebSphere aren’t new, but they are evolving rapidly, sometimes faster than an in-house team can keep up with. Seeking out firms that have solid track records in developing both push and pull solutions, a strong understanding of the platforms necessary to deliver them, and methodologies in place to discover your unique business needs is your best opportunity to avoid costly mistakes.
Back in September the FDA issued what it is referring to as “Final Guidance” concerning how medical device regulations will be applied to mobile medical applications. The push has been on to firm up how these will be regulated within the context of existing rules, primarily due to the focus on proactive healthcare via technology. With the Affordable Care Act still struggling, the healthcare industry is wasting no time in looking at how it can begin to drive better patient results now that providers drive the system, a complete turnaround from when it was the insurance companies that set the standards.
Multiple organizations now exist for the sole purpose of promoting more communications technology solutions for patient care. Many of these were started by the insurance providers, particularly those in the category known as “provider of last resort”. Non-profit companies like the Blue Cross Blue Shield Network were among the early supporters of efforts like this.
This focus on using application development on devices that have historically not been used in patient care made it necessary for the FDA to give the firms a framework. The “guidance”, which is available here, is described as containing “nonbinding recommendations”. It is very detailed in what parameters you will need to consider when developing patient targeted apps.
Despite being described as “nonbinding”, the guidance will nonetheless be taken seriously by most in healthcare, particularly as not following it could expose them to legal problems if a patient claims to have been adversely affected by an application’s use.
The guidance covers both the device and the platforms that exist to support the app.
In the FDA’s press announcement concerning the final guidance, the first draft of which was issued in 2011, suggests a focused enforcement plan stating “The agency intends to exercise enforcement discretion (meaning it will not enforce requirements under the Federal Drug & Cosmetic Act) for the majority of mobile apps as they pose minimal risk to consumers. The FDA intends to focus its regulatory oversight on a subset of mobile medical apps that present a greater risk to patients if they do not work as intended.
Mobile apps have the potential to transform health care by allowing doctors to diagnose patients with potentially life-threatening conditions outside of traditional health care settings, help consumers manage their own health and wellness, and also gain access to useful information whenever and wherever they need it.”
The FDA is focused on mobile medical apps:
- That are intended to be used as an accessory to a regulated medical device – for example, an application that allows a health care professional to make a specific diagnosis by viewing a medical image from a picture archiving and communication system (PACS) on a smartphone or a mobile tablet.
- Or transform a mobile platform into a regulated medical device – for example, an application that turns a smartphone into an electrocardiography (ECG) machine to detect abnormal heart rhythms or determine if a patient is experiencing a heart attack.
As the director of the FDA’s Center for Devices and Radiological Health, Jeffrey Shuren, M.D., J.D., says: “Some mobile apps carry minimal risks to the consumer or patients, but others can carry significant risks if they do not operate correctly. The FDA’s tailored policy protects patients while encouraging innovation.”
One thing is for sure. Mobile apps will have a big role in healthcare worldwide going forward. As consumer mobile devices become more and more integrated into our everyday lives it is only a matter of time until physicians start tapping into their power. Developing these applications will take a combination of healthcare expertise, firm knowledge of the FDA’s guidance and development experts who can translate that into useful mobile medical applications.
Outsourcing the development will always be a strong option for larger organizations like existing medical data companies, large physician practices, hospitals, and a myriad of yet-to-be-invented patient solutions. The issues and complexities involved in developing foolproof apps that work well on a broad spectrum of patient devices require expertise often not available in-house. Low-cost outsourced development may well be a way for innovative healthcare companies to begin to capitalize on the power of mobile smart devices to better care for their patients.
According to a recent study by Realtor.com, more than half of the listings viewed online were viewed on a mobile device. The study also found differences in which devices were used: in markets where property prices are high, iPads and iPhones led in usage, while in lower-priced markets Android devices led.
Realtors have always been a group that embraces technology, particularly tech that makes the job of matching buyers with homes easier. Now Realtor.com in its guidance has begun to sing the praises of mobile. There are a handful of proprietary apps already on the market, and increasingly real estate agents are providing video tours of properties via their websites. Virtually all are moving as quickly as they can to responsive webs and are themselves utilizing the same mobile devices and apps that their potential customers are using.
In a piece that recently appeared in the Wall Street Journal, Irvington, New York house hunter Ms. Goyzueta and her husband Jason Velez took a video tour before quickly buying their new home. “I was a little on the fence and wondered if I’d really see all the little minute details [in the video],” says Ms. Goyzueta, a 35-year-old mother of three. “But I loved the property. I loved the location. I was shocked that after the tour, I was ready to make an offer.” She made an offer that day.
Large global real estate firm Cushman & Wakefield is an example of the large early adopters of both cloud and mobile technology as part of a “mobile first” strategy. The strategy included migrating their 15,000-strong workforce to Office 365 and Salesforce.com for CRM. They also adopted Box.com for storage, Oracle’s on-demand Human Resources service, and a service desk via ServiceNow.com.
It seems like a big change, but John McKeown, Cushman & Wakefield’s European CIO, says the secret is to go slow, in part because there were challenges. “One of the biggest challenges has been managing the change within IT in terms of support, security and cost,” stated McKeown. “At first, we empowered users to choose their own devices, layered in a company data plan, and provided best-endeavor support.” The company also kept device choice narrow: when employees preferred a company-provided device, it supported only one version of Android on a single device.
As real estate markets continue to show signs of improvement and more firms engage in reaching out to prospective customers, mobile applications and webs will be key to their success. However, most don’t have the resources of a Cushman & Wakefield, and not all of the “off the shelf” solutions will be useful in all markets. Branded applications continue to add value in multiple industries like manufacturing and logistics, so it is likely large and mid-sized real estate companies will need to build solutions specific to their needs.
The good news is there are many well structured and knowledgeable outsourcing firms that can help these companies develop mobile webs, tools and applications that will drive sales. Often these firms are located nearby yet utilize programming resources located in areas where coding costs are lower. By taking advantage of these sources for application development it won’t cost you a ton of money to make your Real Estate business more successful and your properties easily found by the right buyers.
Red Hat, Inc., arguably the largest corporate contributor to the Linux open-source project and founded in 1993, is responsible for Red Hat Enterprise Linux. Dell, the struggling PC integrator, was once a powerful force in PC sales both for businesses and consumers.
An odd pairing? Well, not really. Dell has been a supporter of open source and OpenStack for some time now, and the relationship between the two companies is over 14 years old. The new deal, however, makes Dell the first company to OEM the Red Hat Enterprise Linux OpenStack Platform (ELOP). The jointly engineered product is to be constructed completely with Dell infrastructure elements and utilize Red Hat’s OpenStack platform within Dell’s Cloud Services.
For Dell it is an opportunity to start selling an offering that is tailored for large enterprises wishing to establish private clouds. This follows Hewlett-Packard’s (HP) announcement that its newest version of CloudSystem will utilize the HP Cloud OS which itself is based on OpenStack.
Radhesh Balakrishnan, Red Hat general manager for virtualization and OpenStack, was bullish on the deal at this year’s Dell World conference. “There are definitely parts of OpenStack platform that still have a ways to go in terms of maturing,” he said. “The core parts of the software, however, are operationally solid and Red Hat has worked to bring these components to the level of an enterprise grade implementation. We fundamentally believe that OpenStack will be the new data-center fabric of the future.”
The joint effort provides a real contender for the IaaS share currently held by Amazon, IBM, and HP, but only time will tell if it truly drives adoption of OpenStack in the enterprise private cloud market. It doesn’t, however, necessarily help smaller organizations meet similar goals. Aimed at the big enterprise market, it is not going to be an inexpensive effort.
Developing systems and resources that IT teams can deploy at much lower cost is still the job of both in-house and outsourced development teams. Often companies find that they don’t require everything that can be done, but rather more specific database tools or collaborative systems that meet unique business needs. A “one size fits all” option usually costs more and drives up end-user learning curves.
Thankfully, if yours is a company that has more specific needs but lacks the in-house prowess to develop them, there are companies available as outsourced assets, and many offer reasonably priced services for augmenting your staff or doing the development outright. Most have deployed systems and applications in enterprise environments just like yours for many years, giving them the expertise you will need to keep costs down and the potential for success up.
The move to join forces is a clear indication of where these two companies see the future going. For many end-user organizations this may just signal that it is time for them to examine how they are going to respond to an increasingly tech driven future.
If you listen to Microsoft, SkyDrive Pro combined with either SharePoint 2013 or Office 365 is the best thing since the invention of the Internet. For enterprises this may eventually be true, but there are concerns from many sides about security.
To make matters worse, the clock is ticking on Microsoft to gain reasonable penetration in the enterprise cloud storage market. Box.com, with a focus that started on the enterprise, currently holds the major portion of the market. Dropbox, on the other hand, is securing hundreds of millions of dollars in financing specifically to break into the enterprise market.
Microsoft first developed the original SkyDrive cloud service, part of the company’s new devices-and-services strategy, to support storage on Surface tablets and other “space” challenged devices. It is now automatically installed and enabled by default in Windows 8.1 as a native component. SkyDrive cannot be uninstalled and furthers the operating system’s advertised “deep cloud integration”.
The problem for IT decision-makers centers on the confusion created by the choice of names with the addition of SkyDrive Pro. The SkyDrive on your Surface tablet isn’t the same animal as SkyDrive Pro when it is installed via Office 2013 or the available standalone desktop install.
The service is more like Dropbox, although many complain it is not as user friendly, and allows you to synchronize document libraries and other resources across multiple devices. It also isn’t an entirely new concept: SharePoint 2007 had a similar feature called “Office Groove,” and in the 2010 release it was re-dubbed “Workspace.”
SkyDrive Pro however is a freshly baked approach that models itself far more after Dropbox than it does its predecessors. According to Redmond Magazine contributor Benjamin Niaulin “SkyDrive is a critical addition to SharePoint 2013”. He points out that although many compare SkyDrive Pro to Dropbox, thus raising those dreaded security concerns, for the enterprise “SkyDrive Pro is a different story.”
Microsoft has been pushing users not to customize their SharePoint 2013 deployments natively, as many have in past versions, but rather to focus any customizations on APIs and an application-based approach. That, combined with the enterprise-focused security measures in SkyDrive Pro, is a nod to the growing integration of multiple mobile devices into everyday business life.
Now, as companies look at their future SharePoint plans, they will need to carefully consider business-driven application development under this new approach and how to maximize the usefulness and security of SkyDrive Pro. Many firms may find themselves unprepared for this shift in how application development is supported. However, thanks primarily to the widespread use of SharePoint in both mid-size and large enterprises, there are resources that can help.
A number of well-trained development teams with years of SharePoint experience are available to support your application development efforts. Many outsourcing firms specialize in providing expert consulting throughout the planning, development, deployment, and post-deployment management of SharePoint 2013.
Ultimately, you and your IT team will need to make the hard decisions about how to get the most out of SharePoint and SkyDrive Pro, but both are adding value to businesses all over the globe. Something to consider as you look ahead at your IT future.
In addition to Salesforce CEO Marc Benioff’s announcement about “SuperPod,” which we wrote about earlier this month, came the launch of the long-awaited upgrade dubbed “Salesforce 1.” Although not a completely new product, it is a consolidation of several existing Salesforce products, including the core CRM platform, social tools Chatter and ExactTarget, the marketing cloud, and multiple built-in third-party applications.
One interesting addition is support for further development of tools to garner data from customer devices equipped with embedded sensors. This is the sort of location-based data, particularly data that leads to purchases, that fuels the ability to serve up personalized product offerings based on now-observable customer habits. It feeds off the rapidly growing interest in the “Internet of Things,” where virtually everything would be in some manner connected to the web.
Benioff underscored this in a recent earnings call when he stated, “You have got to be ready to sell, to service, to market to your customers regardless of where the customer touch point is. This is really the Internet of customers, where companies connect to their customers through the next generation of devices, apps and services.”
Part of the stack that Salesforce 1 creates has the Heroku cloud service riding along with the core components of Force.com and ExactTarget FUEL. Heroku, acquired by Salesforce in 2010, provides a means for building and deploying online applications directly into the Salesforce ecosystem.
As Klint Finley of Wired Magazine puts it “Heroku offers what’s known as platform-as-a-service. Basically, it’s an online tool that lets you build and deploy large online applications without having to worry about the hardware and software infrastructure that keeps them going. Salesforce also runs Force.com, a visual tool for creating and integrating applications on the Salesforce platform, but the Heroku service is something that’s far more attractive to hardcore software developers.”
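Finley’s description can be made concrete with a toy sketch. The Python below is purely illustrative, not Heroku’s actual interface: the app exposes one standard WSGI entry point, and the hosting platform supplies everything around it. Here the platform is simulated with the stdlib `wsgiref` server and a `PORT` environment variable, a common PaaS convention; both are assumptions for illustration.

```python
# Illustrative sketch only -- not Heroku's actual API. The point of a
# platform-as-a-service is that your code supplies a single entry point
# and the platform supplies the server, host, and port around it.
import os
from wsgiref.simple_server import make_server

def application(environ, start_response):
    # The platform routes each HTTP request to this one callable.
    body = b"Hello from a PaaS-hosted app"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

def serve():
    # PaaS providers typically inject the listening port via an
    # environment variable; "PORT" here is an assumed convention.
    port = int(os.environ.get("PORT", 8000))
    make_server("", port, application).serve_forever()
```

The appeal for developers is exactly this split: the application is ordinary code against a standard interface, while the hardware and software infrastructure that keeps it running stays the platform’s problem.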
Although the launch of Salesforce 1 marks a giant leap for the company, it will require new and more specialized skills from IT teams to get the most out of it. The capability to design and deploy applications specific to your business or your audience often isn’t available from existing staff. Hiring people with more specialized skills may be an answer, but it is one that may be out of budget reach for some companies.
For these organizations there are options. Outsourced development has matured greatly over the years, and a large number of teams are available with the necessary expertise. These firms allow clients to use advanced development resources as needed, to fill expertise gaps or to staff work on a project-by-project basis.
Add to that the fact that these teams are generally located in areas where personnel costs are far lower than at comparable North American firms, and yet most also maintain North American project management, the better to maintain the relationship and keep projects on track and on budget.
There is no denying anymore that Salesforce has become a real leader in this industry, and the deployment of Salesforce 1 is the next step in the company’s long-term plan.
You might say mobile is moving like a freight train through companies large and small. In fact, if you are not considering mobile applications as part of either your business operations thinking or your marketing and customer support efforts, your organization may well get run over by the companies jumping on that train.
InformationWeek’s 2013 Mobile Application Development Survey contains a significant amount of data showing that more and more IT departments are getting marching orders that include the rapid development of both web-based and mobile applications. Understanding the breakdown of the respondents helps put it in perspective: 28% have 5,000 or more employees; 21% are over 10,000. Education, consulting, and financial services are well-represented, and 28% are IT director/manager or IT executive management (C-level/VP) level; an additional 9% are non-IT executives (C-level/VP) or line-of-business managers.
Some highlights from the responses:
- Tech leaders that responded to the survey stated clearly that they are revving up more app development, not less.
- Android phones edged out both iPhones and iPads among platforms in use or under evaluation, cited by 78%; that’s up 12 points, from 66% in 2012.
- 59% say coding and UI work are being done in-house versus 18% using external providers.
- When specifying devices and platforms custom apps must support, IT and the business are in sync: “It’s a collaborative decision” was the No. 1 choice, up seven points.
- 28% have no plans to develop browser-based mobile-optimized apps in the next 12 months; 30% say the same about native custom apps.
Roughly equal shares of respondents indicated that they wouldn’t be developing in one mode or the other. The native development camp cites security and device-specific performance as the rationale for “going native,” while folks focusing on browser-based, device-agnostic applications want the advantage of cross-platform capability, faster development times and ease of deployment.
In considering the report, industry commentator Andrew Murray identifies the big problem facing companies trying to keep up. He states in his InformationWeek review of the report that “One reason is that many companies have been building Web apps for a long time. They either have experienced in-house developers or can tap a robust market of third-party developers. Mobile development poses a new set of challenges. According to respondents, complex code development is their biggest challenge when it comes to mobile, followed by cross-platform compatibility and by finding or nurturing expertise.”
Although the report still showed much of the development being done in-house, it clearly identified shortcomings in those resources. With the growing availability of inexpensive outsourced development services, the rush to create applications might drive more companies to look for outside help. You can expect that 18% outsourced-development number to grow quickly as the pressure to meet new demand continues to build. Outsourcing some or all of your application development will help fill experience gaps without driving overhead beyond your present needs. It also gives you the flexibility to expand and contract your development expenses as needed and to focus your in-house team on more critical projects or those that require specialized in-house expertise.
Companies will also find a rich set of resources in the form of development firms that have already been down this road for others. Tapping into their understanding of both your business or industry and the best ways to move it to mobile devices will save you time and money, and may even make the difference in whether you win or your competition does.
The evolution of Salesforce from application to platform has been dramatic, with its “cloud” growing in size, capability and available third-party support. Now one of its largest clients is joining forces with it in a deal that has many pundits scratching their heads.
The new partner is HP, the computer integration giant and also Salesforce’s biggest customer. In a deal announced on November 19th, the two companies are creating a service aimed at organizations with extensive IT infrastructure and concerns about security and regulatory compliance in an “all cloud” platform.
As part of the deal, HP is putting dedicated computer servers, data storage and networking into Salesforce cloud-computing facilities. This in effect creates dedicated HP systems configured to the specific needs of the client organization, something Salesforce is currently unable to do via its main cloud.
Marc Benioff, Salesforce’s co-founder and chief executive, notes “The reality is, this is a whole new vision of cloud computing, it’s a public, private cloud — public cloud services but with dedicated hardware. Speed, with control.”
The new product dubbed “Superpod” has been designed with big customers in mind. These are customers that reflect a growing trend, “I think there is a lot of push from big clients to be treated differently. Something that I see in our discussions is how you can still preserve the positives of a cloud portfolio, like agility and fast releases in innovation, though have a private addition,” according to Sven Denecken, SAP’s VP of Cloud Solutions Strategy, “There are clients who want that. I don’t think cloud computing has made the life easier for companies and for IT because landscapes are heterogeneous, they get more heterogeneous.”
In many ways this effort bucks the cloud’s traditional thinking, primarily the notion that any kind of equipment in the cloud can be used in order to get the most out of the gear, and that the only thing that counts is the software that connects it all.
Superpod has HP stacks sitting next to more traditional cloud resources but designed for a single tenant, rather than the multi-tenant setup found in most clouds. All involved believe that organizations in regulated industries, banking and healthcare for example, will be among the first targets for this type of infrastructure, as it alleviates their concerns about security and regulatory compliance.
Nonetheless, big or small, the growing prowess of Salesforce as a platform shows that even mid-sized companies can do more than ever before with it. It also points to the need for solid expert input into your adoption of Salesforce. As it is a platform with a great deal of flexibility for customization, development resources exist that can help you maximize your use of SF.
Small to mid-size companies can reduce the cost of those resources by employing outsourced services. Development vendors with “in country” project managers and offshore development teams can often conduct your projects with the highest level of expertise, but at costs that reflect their large pool of low-cost development personnel.
The SF/HP deal is a new wrinkle in the cloud universe, but only time will tell whether it adds sufficient value for customers despite its “slightly higher price.” If it is worth it, we will know soon enough.
As of early November 2013, Oracle will stop offering commercial support for new versions of the open-source application server GlassFish. This leaves customers who require that support with only one option: moving to WebLogic Server.
This comes just prior to the release of GlassFish Open Source Edition 4.1 in 2014, but Oracle will not be releasing a commercial version of the product it acquired when it bought Sun Microsystems.
Current customers with GlassFish Server 2 or 3 will continue to receive support, but the change will leave a short-term gap for some, as WebLogic will be the only Java EE 7-compatible option. TmaxSoft’s JEUS 8 application server has achieved Java EE 7 compatibility but isn’t scheduled for release until next year.
Oracle is strongly suggesting the move to WebLogic Server, noting that:
- Applications developed to Java EE standards can be deployed to both GlassFish Server and Oracle WebLogic Server.
- GlassFish Server and Oracle WebLogic Server have implementation-specific deployment descriptor interoperability.
- GlassFish Server 3.x and Oracle WebLogic Server share quite a bit of code, so there are quite a bit of configuration and (extended) feature similarities. Shared code includes JPA, JAX-RS, WebSockets, CDI, Bean Validation, JSF, JAX-WS, JAXB, and WS-AT.
- Both Oracle GlassFish Server 3.x and Oracle WebLogic Server 12c support Oracle Access Manager, Oracle Coherence, Oracle Directory Server, Oracle Virtual Directory, Oracle Database, Oracle Enterprise Manager and are entitled to support for the underlying Oracle JDK.
Affected companies that don’t want to pay the considerably greater cost of WebLogic may wish to secure the services of an outsourced firm that can offer similar management and development support for GlassFish and its future permutations. For those who do want to make the move to WebLogic, seeking the outside expertise of firms that specialize in supporting the migration of application server types is likely to save both time and money.
A number of firms in North America maintain local project managers who take advantage of the low-cost development and system-migration expertise found in offshore outsourcing. These companies can maintain the client relationship throughout the project, with managers you can meet face to face, while still taking advantage of the lower staffing costs of offshore technical teams.
A recent survey conducted by Ovum, an independent research firm focusing primarily on biotech and pharmaceutical companies, indicates that virtually all intend to dramatically increase their spending on business intelligence (BI) and analytics tools.
The survey drew on input from life science IT executives; when asked about their investment plans for the next 18 months, more than half indicated they intend to install all-new performance management systems. Many are also planning a complete replacement of database query tools and predictive analytics.
Ovum analyst Andrew Brosnan points out that “although there is an increase in budgets for analytics, for many life science companies these are new technologies, and they will thus require guidance on how to implement and use them in order to gain maximum [return on investment].”
We are not talking about small change, either. The report forecasts that the life sciences industry will spend about $40.8 billion by 2017, with 60% of the respondents seeing clinical research IT as a “top-three priority” over the next year and a half. 30% also noted that it is a concern “due to the competitive advantage it can bring in improving the efficiency and productivity of drug development.”
Brosnan sums it up, “These survey results reflect how the larger trends within the industry are affecting pharma IT investment in the US, such as outcome-based reimbursement plans, lower operating margins, mandatory price cuts, and personalized medicine. These factors are driving the pharma industry to glean more insight from large and more diverse data sets to drive down the cost of drug discovery and development.”
As he mentioned, most of these organizations are not prepared for the need, nor do they have the kind of in-house expertise necessary to accomplish the task as quickly as needed. These are new territories for even the most capable life sciences executive, with the cost of staffing and of necessary outside expertise a major concern.
Organizations facing these challenges can consider working with outsourcing firms that maintain local project experts and utilize lower-cost development, migration and implementation teams offshore. Many “off-the-shelf” tools already exist, and these outsourcing companies have made using them in custom applications easy and cost effective.
For companies facing the challenge of keeping up, BI will be a factor. Now there is at least one survey showing that many in the life sciences industry believe that to be true and are willing to spend the money necessary to win in this space.
For many years Facebook has been building its own machines using an approach that ensures it pays only for what its specifications require, stripping out the hardware and software features not needed for its specific applications. This approach, which the company has employed to build out its own network, has led to a joint effort by Facebook, Intel, Broadcom and Mellanox: the “Open Compute Project.”
The aim of the project was framed roughly two years ago with Facebook acknowledging the success of using open platform designs to transform energy efficiency in global data centers.
According to a release from PR Newswire issued at the launch of the project in 2011, “Facebook and our development partners have invested tens of millions of dollars over the past two years to build upon industry specifications to create the most efficient computing infrastructure possible,” said Jonathan Heiliger, vice president of technical operations at Facebook.
“These advancements are good for Facebook, but we think they could benefit all companies. Today we’re launching the Open Compute Project, a user-led forum, to share our designs and collaborate with anyone interested in highly efficient server and data center designs. We think it’s time to demystify the biggest capital expense of an online business — the infrastructure.”
Fast-forward to today and we see that the member companies of the project, along with some other vendors, are beginning to release testable designs for an “Open Compute Switch,” a top-of-rack bare-metal switch with a boot loader that project partner Cumulus Networks developed. Broadcom introduced and built a switch based on the Trident II chip architecture, Mellanox put forward its SwitchX-2 switch, and Intel offered a specification for a switch that Quanta and Accton have been building. The three switches are now being tested in Facebook’s labs.
Hardware vendors like Hyve Solutions have released designs to meet the server needs of the project. Now the arrival of a webscale-approved open-source switch is just around the corner, six months after the Open Compute Project announced plans to come up with such a switch.
But contributions to the project now go beyond hardware. Cumulus Networks, which delivers a Linux networking operating system, is providing the Open Network Install Environment for the Open Compute Project. The idea is to be able to run different operating systems on networking hardware, said JR Rivers, CEO and a co-founder of Cumulus.
If successful, it could provide alternatives to established networking vendors like Cisco, Arista Networks and the struggling Dell Force10 division, chiefly because the project promises to deliver a “specification and a reference box for an open, OS-agnostic top-of-rack switch.” Whether that reference box will be based on an amalgam of submitted specifications or just one of them isn’t clear yet, and no release date has been set.
Cisco seems unimpressed by the project so far and has itself become involved in another “open”-oriented effort, “OpenDaylight,” which may offer attractive alternatives to Open Compute. Such projects are Cisco’s way of not letting another idea slip past it, as Software Defined Networking initially did; the hope is to avoid falling behind others and having to play defense in its own market.
The beauty of the open concept is that it frees developers and IT teams to keep both costs and energy use down within these new data center concepts. It also allows the already existing community of open source developers to provide expertise and lower-cost development resources, bypassing the more expensive proprietary system vendors. And it means that equipment specs will no longer be “one size fits all,” but rather a size that fits your organization’s needs.
There are a number of well qualified open source experts that are well versed in all aspects of data center operation, data warehouse design and integration of new open source technologies. Several of these are located offshore and typically the better ones have U.S. teams or project teams in multiple countries to make ongoing contact with clients efficient and productive.
Given Facebook’s now-impressive network and the way the company has shown the way for many other providers and customers, you can bet more and more “open” is in our future.
It seems you don’t read quite as much hype about “Big Data” these days but some industries are diving in with both feet. Healthcare has even adopted its own use of the term “Big Health Data” as more and more initiatives begin to harness the huge amount of “unstructured data” that is produced daily by physicians, clinics and hospitals.
One example here is the newly launched clinical data warehouse, the nation’s first, that tracks the health records of 3.2 million South Carolinians. Developed by Health Sciences South Carolina, the effort was a collaboration between South Carolina’s top universities and hospital systems.
According to the Charleston Regional Business Journal the “Clinical Data Warehouse links and matches anonymous electronic patient records from the state’s largest health care systems to enable providers and researchers to follow patient conditions in real time. The data warehouse also allows biomedical researchers to conduct patient-centered outcomes research and comparative effectiveness studies across a much broader and aggregated patient population base.”
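The phrase “links and matches anonymous electronic patient records” describes a well-known technique, privacy-preserving record linkage. The Python sketch below is a generic illustration, not the actual Health Sciences South Carolina implementation; the shared key, field names and normalization rules are all assumptions. The idea: each participating system derives a keyed hash from normalized identifiers, so the same patient yields the same token everywhere while the raw identity never leaves the source system.

```python
# Illustrative sketch only -- not the real South Carolina system.
# Each source system publishes only a keyed hash ("link token") of
# patient identifiers, letting a central warehouse match records
# without ever seeing names or birth dates.
import hashlib
import hmac

SHARED_KEY = b"agreed-upon-secret"  # hypothetical key shared by participants

def link_token(first, last, dob):
    """Derive a matching token from normalized patient identifiers."""
    normalized = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
    return hmac.new(SHARED_KEY, normalized.encode(), hashlib.sha256).hexdigest()

def match(records_a, records_b):
    """Pair up records from two systems that share a link token."""
    index = {r["token"]: r for r in records_a}
    return [(index[r["token"]], r) for r in records_b if r["token"] in index]
```

Normalizing before hashing is what lets “ ANN smith” in one hospital’s system match “Ann Smith” in another’s; real deployments use far more elaborate normalization and probabilistic matching than this sketch.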
The deployment of a well designed and implemented data warehouse is a first step in mining and using real-time big data flows. Some systems exist “off the shelf” and still more are being developed “in-house” or through collaborative efforts like those of the South Carolina health systems. Teradata recently announced the introduction of their cloud based solution that provides data warehousing by way of a “Software as a Service” model.
According to Alex Giamas, writing in a recent edition of InfoQ: “From a technical standpoint, at the current stage Teradata Cloud fares closer to Amazon Redshift than any other cloud offering. Teradata offers physical dedicated hardware in its cloud, which is appealing for large corporations that can’t have their data collocated with other datasets. Teradata’s selling point for the Teradata Workload Manager is also a richer set of features as compared with Amazon’s Data Pipeline offering.”
Oracle is also extending its platform approach to meet the Big Data need, with an Oracle Big Data Appliance X4-2 offering that makes use of the latest generation of Intel Xeon processors and 4TB magnetic disk drives.
George Lumpkin, vice president of product management at Oracle, states that the tech giant views Hadoop and other forms of NoSQL as a natural extension of the data warehouse. Those data warehouses, notes Lumpkin, already contain customer information in a structured format that often stretches back across multiple decades. Hadoop and other technologies now allow those data warehouses to be extended to include all types of unstructured data, including clickstreams and all manner of machine data.
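Lumpkin’s point about folding clickstreams into the warehouse rests on the map/reduce pattern that Hadoop popularized. The toy Python below shows the shape of it: map semi-structured log lines to (key, count) pairs, then reduce by summing per key. The log format is invented for illustration, and a real Hadoop job would run these two phases distributed across a cluster.

```python
# A toy illustration (assumed log format, not a real Hadoop job) of the
# map/reduce idea behind folding clickstream data into a warehouse.
from collections import Counter

def map_clicks(log_lines):
    # Assumed format: "timestamp user_id /page/path"
    for line in log_lines:
        parts = line.split()
        if len(parts) == 3:            # skip malformed lines
            yield parts[2], 1          # emit (page, 1)

def reduce_clicks(pairs):
    # Sum the emitted counts for each key (page).
    totals = Counter()
    for page, count in pairs:
        totals[page] += count
    return dict(totals)

logs = [
    "2013-11-19T10:00 u1 /products",
    "2013-11-19T10:01 u2 /products",
    "2013-11-19T10:02 u1 /checkout",
    "garbage-line",
]
page_views = reduce_clicks(map_clicks(logs))
# page_views -> {"/products": 2, "/checkout": 1}
```

The tolerance for malformed lines is the crux: unstructured sources are messy, and this pattern lets the warehouse absorb them anyway, aggregating what it can recognize.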
Many companies will find that getting to the data and utilizing it the way these data warehouse systems do can be done at a cost their organization can afford, even without the resources of an entire state or a staff full of data warehouse experts. A number of firms utilize design and development teams that serve as less expensive, short-term resources for building your “Big Data” warehouse.
Utilizing U.S.-based project management and offshore development, a data warehouse and the systems to utilize it can be deployed at a fraction of the cost that mainstream providers charge. The emphasis in these development communities has for many years been on technical proficiency with both off-the-shelf systems and the customization of entire data warehouses to customer need.
One thing is absolutely clear. Now that the hype has passed, Big Data has become an objective that companies take seriously as they continue to look for real solutions for taking advantage of a growing resource.
When most of us picture a “remote worker” or someone who works from home for a larger company we think of a desktop with maybe some access to company applications and files. After all, the use of “virtual desktops” itself isn’t that new. It is, however, advancing rapidly as a result of both advancing technology and the access to faster Internet connections.
The bane of remote workers was once that access speeds were so low that it was best to use a “thick client” and offer only minimal applications; for most that meant interacting with company email and having access to company and personal work files. Now most users enjoy Internet speeds that are significantly faster and, more importantly, mobile, and mobility provides yet another wrinkle to serving the remote user.
Global online tech publication The Register’s Chris Mellor is fairly certain that the improvements in technology spell the “death of the business PC”. His belief is grounded in a handful of new technologies and offerings, a dash of Bring Your Own Device (BYOD), and the “pricey and complex to manage business desktops”.
The technologies he cites as cornerstones for Virtual Desktop Infrastructure (VDI) eventually killing the desktop are:
- EMC’s XtremIO all-flash array, twinned with VMware View or Citrix XenDesktop and backend VNX arrays, can support 7,000 or more virtual desktops
- All-flash SolidFire arrays can support large-scale, 1,000+ VDI roll-outs
- Atlantis and all-flash Violin arrays can support thousands of virtual desktops
- Hybrid flash/disk array startup Tintri can support thousands of virtual desktops
- Startup Pivot3 supports VDI
- There is a VDI-focused Vblock from VCE
Some companies have been utilizing forms of virtualization for years, generally with “fat clients” for end users, but increasingly they are seeing value in migrating to newer technologies.
Anthony Smith describes his company’s migration: “The move from in-house to hosted infrastructure is a formidable challenge. Running your server estate on someone else’s kit, in their data center, makes every user a remote user, and for us, some kind of virtual desktop technology seemed an obvious choice. We spent a large amount of time investigating various VDI and session-based solutions, considering all aspects from the “user experience” through to technical and commercial considerations.
In the end, the choice of desktop technology depends heavily on the specific use case and application stack you need to deliver. For us, Citrix XenApp 6.5 on Server 2008 R2 was a better fit than VDI, for the following reasons:
- Less resource intensive than VDI, because there’s only one copy of the OS for multiple users, greater densities of concurrent users are achieved and storage requirements are reduced.
- Efficiency – the mature ICA/HDX protocol delivers low bandwidth consumption with a rich user experience.
- Flexibility – both XenApp and RDS enable individual applications as well as full desktops to be published.
- Simplified management compared to VDI.
- Familiarity – an update of skills rather than a complete new learning curve was required.
- Scalability – running XenApp servers on VMware enables servers to be scaled out easily.
- Less complex to deliver and lower TCO than VDI.”
Smith also emphasized the importance of a well rounded team. He says, “Development of the new platform involved bringing together a project team that included our in-house team, specialists from the hosting provider, and external engineers with Citrix skills. In addition to putting together the right tech team, the success of any sizable IT project depends on getting users to buy in and take ownership, therefore we engaged with users from all areas of the business at an early stage in the process.”
There are multiple options for finding the kind of expertise you need for moving to a virtualized system, but many companies are not equipped to do it on their own. For those organizations, outsourcing some or all of the planning, development, deployment, and operation of a new VDI system is a sound option. Many outsourcing firms have locally based project managers to ensure that projects meet the client’s expectations, and offshore development resources to help keep costs down.
As more companies use technology to improve their competitiveness, it will be important for even small to mid-sized firms to consider whether or not VDI will save them money and improve their “bottom line.”
From the release of Microsoft Dynamics CRM 2013 to the new capabilities in the upcoming Microsoft Dynamics AX 2012 R3 update to its flagship ERP product, the big M is trying to deliver something for everyone…everywhere.
And by everywhere we mean both feature improvements that affect all users, some that are country specific and the ability to deploy on Windows Azure. The latter allows companies to choose between an on-premise deployment or a private cloud. With the addition of multiple new apps and the upgrade of several others Microsoft Dynamics is supporting the use of their powerful systems on the go.
From accounts payable to marketing and human resources to inventory management there are too many new features to list here. In general all the new features are aimed at streamlining processes and providing industry specific process templates that further enhance the usefulness for the nearly 100,000 customers of the product.
Belgian users will gain the ability to generate electronic payment files for Single Euro Payments Area (SEPA) direct debits. Chinese users will be able to update and maintain the transfer history for fixed assets transferred from one location to another. Users in Finland will be able to generate electronic payment files for SEPA credit transfers, and folks in Denmark will be uploading and maintaining European Union (EU) entry certificates for items or services. Seventeen specific countries are gaining new features, several of them more than one.
As for the release of Microsoft Dynamics CRM 2013, it too has become more mobile, adding multiple apps to aid in its mobile use. At the company’s 2013 Global Premiere in Barcelona, Microsoft announced a set of 18 new predefined and configurable process templates that include sports management, healthcare, government and nonprofit, as well as some specialized areas such as prison offender management.
Delivering on the promise of giving customers access to their CRM information on any device, Microsoft also confirmed the availability of new touch-optimized experiences on Windows Phones, iPhones and Android phones that give customers powerful functionality and analytics on the go, all without a separate license fee.
This continues Microsoft’s push toward integrating all of its business-related applications and systems, as well as making them both cloud ready and accessible by mobile device. Nonetheless, despite the huge positive impact this creates for businesses that use these tools, it makes the IT exercise a bit harder. Planning for their deployment, customizing applications or even creating need-specific applications often calls for broader experience and skill levels than are available with existing staff.
If your company is considering taking full advantage of these powerful new features soon to be available in the Microsoft Dynamics universe, you may want to consider outsourcing part of the planning and execution of these projects to a company that has deep experience in these types of upgrades and deployments.
Often there are U.S.-based project teams that employ offshore programming and development personnel focused specifically on supporting systems like Microsoft Dynamics, Salesforce, and other CRM/ERP systems. By taking advantage of onshore experts to guide the offshore resources, companies can support the deployment and custom development requirements at costs far lower than if all development were U.S.-based.
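As a rough illustration of the economics, the blended hourly cost of a mixed team can be computed as a weighted average. The rates below are invented for the example, not quoted from any provider.

```python
def blended_hourly_rate(team):
    """Weighted-average hourly rate for a mixed team.

    team: list of (headcount, hourly_rate) tuples.
    """
    total_heads = sum(n for n, _ in team)
    return sum(n * rate for n, rate in team) / total_heads

# Hypothetical rates: one U.S.-based lead guiding four offshore developers,
# versus a five-person all-onshore team.
mixed = [(1, 120.0), (4, 35.0)]
all_onshore = [(5, 120.0)]
```

With these illustrative numbers the mixed team blends to $52/hour against $120/hour all-onshore, which is the cost argument the model rests on.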
One thing is for sure: Microsoft once again is not sitting still, and with these new features and updates its flagship CRM/ERP products are becoming formidable options, particularly for smaller and mid-sized companies.
According to research recently conducted by Vanson Bourne, an independent global market research firm, CIOs outsource 48% of testing and development projects. Yet the study revealed that the change in dynamics involved when IT becomes more of an outsourced management environment rather than a fixed internal team causes some projects to fail.
Many times these failures appear to be the fault of the outsourcing provider. Yet when the situation is examined more closely and objectively, even the CIOs themselves soon learn that they need to change their management methodology, not their outsourcing partner.
For example, the report found that 81% of those polled said they were “not totally confident” in telling an outsourcing company exactly what they wanted, yet the biggest reason projects overran or failed to deliver (cited by 55%) was ‘too many changes occurring to the requirements during the project’.
Other key reasons for projects hitting the skids included a lack of skills to interpret requirements (47%) and insufficient testing (40%). Fully 85% of organizations use a “variety of stakeholders” in sorting out requirements, resulting in increased complexity, whilst only 37% said they could manage variability and change during these contracts.
In an unrelated survey of IT decision makers in the UK, France, Germany and South Africa, 52% of respondents indicated that their ability to innovate was being compromised by a lack of budget dollars. Similar U.S. statistics on budget constraints indicate that budgets continue to be tight despite organizational needs for new and innovative uses of technology.
So what’s a CIO to do when they don’t have big budgets but they do have big challenges? The answer is still outsourcing.
It may not mean outsourcing everything, but for companies that are fundamentally changing how they meet these challenges, there is one model that is critical to consider so that projects have a better chance at success.
Lois Coatney, in a comprehensive piece for CIO Insight, has some excellent suggestions for creating an “outsourced operations” environment for CIOs and their staffs. She points out that “The CIO’s team also changes dramatically after a transition to outsourcing. What had been a single staff now comprises a number of service providers within the supply chain. As a result, the manager’s responsibility shifts from assessing individual performance to assessing overall service compliance to agreed service levels in the contract. Similarly, incentives to change behavior must be focused on service-level improvements rather than individual staff improvements.”
It is also helpful to select outsourcing partners who have a well-established and documented method for engaging and satisfying the requirements of client projects. A number of these outsourcing partners have deep roots in both new technologies and legacy systems. Because the best of them maintain both low-cost offshore development assets and in-country personnel to build and maintain an effective outsourcing relationship, companies with tight budgets can still meet their organizational needs.
CIOs and other IT management will eventually need to provide innovation at a low cost. One way is to in-source the management of outsourced assets by utilizing your more specialized in-house staff to manage projects to success while keeping overall development costs down.
Many a CIO’s job is now on the line to move their organization into a tech-driven future. Doing a better job with outsourcing may well be the key to keeping those jobs.
Although WordPress still maintains a significant lead in the open-source content management system (CMS) world, with 19% of all websites running it, number-two-ranked Joomla! at 9% is striving to offer advancements that may well help it close the gap. After all, 9% still represents 45 million downloads and installs of a well-respected open source system.
Although many Joomla developers have been creating responsive websites for some time, one of the primary draws of 3.0 is that it is designed to be inherently “responsive”. Responsive websites are built to be device-agnostic as it relates to the size of the interface. Unlike natively programmed applications designed to run on specific mobile phones, a responsive website adjusts its look, its size and, in some cases, how its navigation operates. It scales the website to fit the device.
Increasingly, companies are recognizing the need to have websites that cater to mobile devices, but most don’t want to maintain multiple websites to meet that need. With Joomla 3.0 and properly designed templates, companies can meet the critical need to be mobile friendly while maintaining only a single website.
The release of 3.2 brings over 600 improvements, bug fixes and new features, including the incorporation of Twitter Bootstrap, LESS CSS, IcoMoon (a library of retina-optimized icons) and fuller use of jQuery and MooTools. This gives developers all the tools needed to build fully functioning, mobile-ready websites with simpler, shorter development timeframes.
According to Paul Orwig, president of Joomla’s sponsoring organization Open Source Matters, “Joomla is growing at a furious pace, with more than 45 million downloads to date — and a broader audience means it’s more important than ever that the latest version be as simple, stable and straightforward as possible.” Among the new features:
– Version Control – Enables a Joomla administrator to revert to an older version of content on a website and compare side-by-side to see what changes have been made.
– Rapid Application Development (RAD) – RAD simplifies the process for a developer to write their own Joomla extensions; apps that enhance a Joomla site like calendars, videos or polls.
– Extension Finder – This simplifies the process of searching for and installing Joomla extensions by allowing users to do all that directly from within their admin view with just a few clicks. The previous processes included separate steps to first search for and download extensions to a desktop, and then log in to the Joomla admin view and install those extensions.
– Bcrypt – A strong encryption method that scrambles passwords stored in a Joomla website’s database.
– Two-factor Authentication – In order for a change to be made to a Joomla website, an optional requirement can be enforced for the user to provide another form of authentication (beyond a username/password). Two-factor authentication can be something like entering an auto-generated code that is texted to a person’s mobile phone.
– User Interface Improvements – The new Joomla 3.2 administration interface is more streamlined and task-focused.
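The auto-generated codes mentioned under two-factor authentication above are typically one-time passwords in the style of RFC 4226 (HOTP) or its time-based variant RFC 6238 (TOTP). As a sketch of the underlying math only (not Joomla’s actual implementation), HOTP fits in a few lines:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a 64-bit counter, dynamically truncated.

    TOTP (RFC 6238) is the same calculation with the counter derived
    from the current Unix time, e.g. int(time.time()) // 30.
    """
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server and the user’s phone share the secret; because both can compute the same code independently, entering it proves possession of the second factor.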
Other important new Joomla 3.2 features include but are not limited to an improved template manager, multi-lingual site installer, improved WYSIWYG editor, AJAX interface, micro-data semantics library, and HTML form fields and attributes.
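The version-control feature listed above boils down to keeping prior revisions of an article and computing what changed between any two of them. A minimal sketch of that idea (illustrative only, using Python’s standard difflib rather than Joomla’s code):

```python
import difflib

# Two hypothetical revisions of a page's content, stored line by line.
rev1 = ["Welcome to our site.", "We open at 9am.", "Call us anytime."]
rev2 = ["Welcome to our site.", "We open at 8am.", "Call us anytime.",
        "Find us on Twitter."]

# A unified diff marks removed lines with "-" and added lines with "+",
# which is all a side-by-side compare view needs to highlight changes.
diff = list(difflib.unified_diff(rev1, rev2,
                                 fromfile="rev 1", tofile="rev 2",
                                 lineterm=""))
for line in diff:
    print(line)
```

Reverting is then just a matter of restoring the stored earlier revision in full.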
Firms wishing to take advantage of the power this new platform provides will find that a development company with a long track record with Joomla! is their best bet for moving or building their next web effort. Having developed and deployed multiple generations of the platform for previous clients gives such a company the necessary preparation to take full advantage of the advancements 3.2 brings to the table.
Several offshore development teams are available, generally work at a lower cost per hour, and have been developing on open source platforms like Joomla for years. Their deep familiarity with both the platform and the open source development communities that generate these content management systems makes them an affordable and expert resource for your next project.
Currently SharePoint is still king of the intranet for many enterprises and mid-sized companies. However, the move from on-premise deployment to one of the new cloud-based alternatives is going slower than expected. A recent Forrester report entitled “SharePoint: Solid In The Enterprise, But Missing The Mobile Shift”, based on surveys conducted earlier this year, shows a distinct lag in online adoption. It indicates that part of the lag comes from the perception of SharePoint as an on-premise solution, part from security and privacy concerns, and part from worry that customization in the cloud is too restrictive.
One interesting aspect of the report was the fact that 79% of respondents were running SharePoint 2010. The survey also indicated that only 28% stated that moving to some form of cloud-based SharePoint was “never a consideration”, down from 38% in a similar survey in 2012. In addition, 66% of those surveyed were planning to move to SharePoint Server 2013 within the next 12 months.
Yet a recent Redmond Magazine survey indicated that the delay in cloud adoption may in part be companies trying to decide which Microsoft cloud platform to use. For the most part they know that a large part of their end-user base is going to want to be mobile, and currently SharePoint’s mobile performance is lackluster at best. Of the two options, Office 365 or Azure, Office 365 has the majority of adopters, with 14% using Azure. Another 15% had SharePoint cloud deployments housed with a variety of other cloud providers.
Although in this particular survey 55% indicated that they will opt for Office 365, a respectable 29% said they were eyeing Azure. As one of the authors of the Forrester report, analyst John Rymer, notes: “People don’t do customization of SharePoint Online using the old method because the product limits what they can do. Integration, for example, is pretty limited, and Microsoft will not accept ‘just any random code’ and the rules indistinct.”
Because some companies have already built custom apps and want continued access to management capabilities or specific types of integration with other internal systems, they are likely to take a pass on 365 in favor of Azure. As Jeffrey Schwartz points out in his article, “While smaller organizations are the most obvious candidates to go to SharePoint Online Office 365, especially if they don’t have a collaboration solution, larger shops have more complicated decisions to make. Whether or not larger shops are using Office 365, Windows Azure or third-party infrastructure-as-a-service (IaaS) or managed services providers (or any combination of those), the largest trend is toward hybrid implementations where they are adding capacity to existing SharePoint infrastructure incrementally.”
Making the choice requires a high level of expertise and a methodology focused on evaluating the business needs that drive the use of SharePoint. For some there will never be a need for a cloud deployment; for others it will be the only way to go. These same experts are likely needed for the move from SharePoint 2010 to 2013, as there are a number of changes, many aimed at making SharePoint more mobile, that must be considered and dealt with during the upgrade.
Taking advantage of both the expertise and the lower cost of an offshore development company focused on SharePoint is something that the 66% planning on making the move should consider. These companies have well-trained development teams and U.S.-based project managers, and they can supplement your existing developers at extremely low hourly rates. They have diversified over the last few years to offer highly educated staff and well-developed methodologies for planning, deploying and supporting users when moving to the new platform.
Not quite as noisy as its competitors, IBM has been methodically growing its share of the application server market; as of 2012, IBM WebSphere held 60% of that market, easily making it a de facto standard. On this trajectory it could own 70% of the market within the next two years.
IBM feels it is strong in this market, noting in a release last month that its success comes because “Mission critical application servers are needed in the enterprise to support scalability, reliability, and security. More light weight open source application servers have a place in the market for web presence software, but for a solution that involves transactions intensively and has the downside of losing significant revenue if the site is down the mission critical servers are needed.
IBM WebSphere application server is a proven, high-performance transaction engine that can help build, run, integrate, and manage dynamic web applications. The IBM WebSphere application server Liberty profile option and development tool options extend the mission critical aspects of the system. Intelligent management capabilities minimize end-user outages and maximize operations monitoring and control of the production environment.”
IBM, like many companies in this space, has been preparing for the inevitable flood of business-oriented application needs. Some are driven by the “Internet of Things” (IoT), where nearly everything in our lives is somehow tied to the Internet wirelessly. Others stem from the increased need for more sophisticated tools in the field, tying critical systems together and accessing them securely on a mobile device.
Again, in its release IBM need only point to known statistics: “The J2EE application server software market is defined by the ability to build mission critical web sites that support a globally integrated enterprise. Strong growth is anticipated as tablets, smartphones, and mobile devices replace PCs. Mobile devices proliferate with 6.9 billion smartphones anticipated to be installed in 2019.
There are now 6.9 billion cell phone registered, paying users. Portable, mobile systems will expand the Internet at a pace not yet achieved. It is anticipated that the apps market will expand from $24 billion in 2013 to $35 trillion by 2019. This expansion of mobile computing at the device level is nothing compared to what is happening at the machine to machine (m to m) communications, with sensors being located everywhere, and monitoring of those sensors proliferating.”
All of this data was drawn from a recent WinterGreen Research report, which the firm offers for sale and which is an independently driven study of the application server marketplace. Yet much of what has been previewed from the report is consistent with multiple existing application trends and strongly reflects the rising interest in the “Internet of Things” and the new technologies emerging to support it.
Many companies will need to consider strongly whether they should jump on the IBM WebSphere train or, if they are already riding it, how they should approach planning and developing applications for their enterprise. A number of both U.S.- and offshore-based entities are already equipped to aid you with the due diligence necessary for a successful implementation or with application development for the platform. Many offshore companies have built practices around supporting WebSphere in particular and are capable of providing that support at dramatically lower costs.
With IBM once again claiming dominance over a very large market segment, keeping up with its many capabilities will take good business sense and careful planning.
At the OpenStack Summit in Hong Kong this month, Red Hat, Inc. announced multiple new innovations designed to support Red Hat’s open hybrid cloud vision for the future. These include the availability of a beta version of Red Hat Enterprise Linux OpenStack Platform 4.0 (ELOP), advancements to drive OpenShift and a preview of OpenStack-M. OpenStack-M is an open source deployment and management solution for OpenStack-powered clouds that will eventually be integrated into the Enterprise Linux OpenStack Platform.
The company also announced the next release of Red Hat CloudForms 3.0, its cloud management platform. CloudForms adds enterprise-grade cloud management tools to ELOP 4.0, along with additional capabilities useful with a number of different infrastructure platforms.
Gartner has already come out in favor of greater OpenStack use in its “2014 Planning Guide for Private Cloud, Data Center Modernization and Desktop Transformation” report published last month. The report notes that “OpenStack-based cloud management solutions are increasingly viewed by Gartner clients as a means to mitigate risk of lock-in and expand vendor choice in private and hybrid cloud deployments.”
All of this is in support of Red Hat’s collaborative effort with Intel entitled “On-Ramp to Enterprise OpenStack”, which the two companies jointly announced just days before the summit in Hong Kong.
Speaking about the initiative, Red Hat’s general manager of virtualization states, “Red Hat’s longstanding relationship with Intel has played a key role in the adoption of Linux as the strong enterprise platform we see today. Like Linux in its early days, collaboration among technology industry leaders can help foster enterprise adoption of OpenStack. As Red Hat’s OpenStack vision and solutions gain momentum, On-Ramp to Enterprise OpenStack represents the next level of our collaboration with Intel, and I am excited about the opportunity we mutually face to help demystify OpenStack and show enterprise organizations its true potential.”
For many organizations that want to build either hybrid or private clouds using platforms that have been evolving out of the open source movement for years this is good news. These platforms offer lower costs of entry and greater levels of customization than more turn-key or off-the-shelf options. Add to that the fact that a number of development resources exist to aid in planning, development and deployment of these sophisticated resources.
Many offshore development teams have years of both Red Hat and OpenStack experience offering companies lower cost options for designing, deploying and maintaining their cloud. In most cases these teams not only can aid in developing the necessary components but often can aid in migrating existing systems and applications to the new environment.
Either way Red Hat is continuing to be a force in the advancements made in both traditional enterprise computing and in the deployment of cloud-based services.
There are now so many solutions on offer for building your own social network, either in a “cloud” online or “behind the firewall” as a business tool, that the list is becoming overwhelming for companies wanting to deploy one in their business. Many of the more established business cloud services (Oracle, Salesforce.com, Microsoft) are including more and more options for integrating Enterprise Social Software (ESS) into their own hybrid or cloud business solutions.
Currently, the global Enterprise Social Software (ESS) market is estimated to grow from $721.3 million in 2012 to $6.18 billion in 2018, a compound annual growth rate (CAGR) of 44.9% from 2013 to 2018. At present, the high-tech and telecommunications verticals continue to be the big adopters, and North America is expected to be the biggest market in terms of revenue.
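Those growth figures can be sanity-checked with the standard CAGR formula. Note the quoted 44.9% is measured from 2013; the 2012 and 2018 endpoints above work out to roughly 43% per year, which is in the same ballpark:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end / start) ** (1 / years) - 1."""
    return (end / start) ** (1 / years) - 1

# $721.3M (2012) to $6.18B (2018) spans six years of growth.
growth = cagr(721.3e6, 6.18e9, 6)
print(f"{growth:.1%} per year")
```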
Business itself has always had a “social element” to it, so much so that many have complained in the past that too many decisions are made around the water cooler or in hallway conversations, and as a result are often not communicated well or made with enough input from others.
Many businesses have difficulty getting folks even within a single building to communicate well with each other. Yet the advent of social networking in the form of sites like Facebook, LinkedIn and Twitter has laid the groundwork for easy adoption of social-like tools within a business environment.
There is a fine line these days between what are commonly considered collaboration tools (SharePoint is both an example and a leader in this industry) and those that have more of a social feel to them. Yet even SharePoint is often used in conjunction with Yammer in order to provide an experience closer to that of Facebook, which continues to be the model most social software follows.
The real trend, though, is in the area of integration. Gartner’s Magic Quadrant report on enterprise social software indicated that companies are moving away from focusing on differentiating features and toward integrating social software into their larger set of enterprise software.
“Differentiation is getting harder to do and many vendors are starting to have the same elements,” according to Mike Gotta, Gartner’s research vice president for collaboration and social software. “The destination sites are trying to become more purposeful and more contextual around getting work done, as well as more integrated with other vendor offerings. At the same time, social elements are also being added into existing project management and CRM tools.”
He also points out that it is important not to just cram social elements into existing business applications without first understanding whether social will enhance productivity when users take advantage of the tools. In fact, he is not alone in recommending that you start small and take advantage of existing expertise in integrating tools like social into business and collaboration applications.
The burden of integrating all of this often falls on the backs of already strained IT departments. Often they do not have the bandwidth to do more than slap applications together as best they can, and often far more slowly than many enterprises need to keep up with their competition. The extra personnel requirements and level of expertise can be expensive as well.
Reaching out to an offshore development team, particularly one that is expert in application and system integration, can both temporarily beef up your staff and keep costs down thanks to the lower cost of development offshore. These teams often have well-structured methodologies for evaluating your business needs and existing applications or systems, and then formulating the most cost-effective way to integrate social into your business.
Between Yammer, the recently acquired private social networking platform, and additional email integration features such as participating in discussions via email, improved user adoption looks to be one of SharePoint’s big benefits. Add the ability to integrate with Office 365 at multiple points, and difficulty driving end-user adoption should be a thing of the past.
Much of the adoption boost comes simply from giving end-users more touch points they are already familiar with and understand. The private social network gives collaboration a more familiar feel and aids in boosting the comfort level of the user.
For this to succeed, however, SharePoint needs to run increasingly in the background, allowing the more familiar access points to drive the experience and make things simpler. Although most adopters of SharePoint still value its power and usefulness, it hasn’t always been easy for end-users to learn and to maximize their benefit from it.
Yammer, which is essentially a Facebook clone, gives private social networks a familiar interface, and it does what users expect a social network to do. The same is true of other applications that can now be integrated into SharePoint. The bottom line, according to experts, is to give users something easier than what preceded it.
With so many options to choose from, selecting the right ones for your integration strategy is critical. If the rollout is implemented properly by experts, you will require less end-user training, and the learning curve will be significantly shorter than with previous versions of SharePoint. Many firms with development capabilities in this arena also offer end-user training services to further aid you in driving user adoption and building a greater return on your system investments.
So whether you are building out an existing collaboration environment or implementing your first, taking a look at how Microsoft has made integrating these systems possible is going to be critical to your ultimate success.
According to Raj Jain, CEO of Intalio, speaking of the company’s release of its new portlet for SharePoint Web Parts, “SharePoint users looking for an enterprise-level business process management platform to automate their mission-critical complex processes had few choices until now. We are thrilled to combine our years of BPM experience with our partner’s SharePoint skills to offer a highly-competitive, proven, best-value solution. With Intalio|bpms and SharePoint, administrators and end-users alike experience the best of both products: world-class BPM on their trusted application platform.”
Intalio is a leading provider of application development tools, business process management (BPM) software and its “Jetty://” application server. In a partnership with JPL Informatique SA the two companies joined forces to develop the portlet and provide for its integration into SharePoint.
- Task List: End-users access task lists, notifications lists, and processes from within SharePoint. No need to launch an external application.
- Process List: End-users and administrators initiate workflows from SharePoint, making the most of their familiar platform environment.
- SharePoint Native Forms: Automatic conversion of forms designed in Intalio|bpms to native SharePoint forms provides a solid homogeneous user experience to the end-users throughout all applications.
- Task Management: End-users claim, revoke, reassign, export, and skip tasks within the portal. Task management is comfortable and familiar, and therefore more effective.
- Task search: Real-time search through the task list based on task metadata makes the most of SharePoint platform functions.
- Multilingual: Pre-bundled with English and French support with additional languages available for best-in-class BPM, worldwide.
- SharePoint DMS integration: Workflow documents are stored automatically in SharePoint’s document library, easily navigable and accessible from the user interface — and easier to use.
- Notification Bar: Notifies users of pending tasks and notifications within SharePoint as part of the fully integrated features and interface.
The portlet and Intalio’s suite of developer tools will allow development companies working with SharePoint clients to integrate a powerful BPM into the enterprise collaboration platform. As internal customers clamor for more integration between the planning tools they now depend on, companies will be well served to consider this one.
For companies that lack the internal expertise to implement new tools like the Intalio BPM portlet, or that require other system and database integration with SharePoint, an outsourced team can often save both time and money. Whether working onsite or remotely, many of these teams are located offshore and can provide development services at costs far lower than even an internal team.
SharePoint continues to lead the collaboration market and this new tool will only strengthen its usefulness to organizations of all sizes and missions.
There are good arguments on both sides. HTML5/JavaScript apps are faster to develop for more devices, the methods are tried and true, and most developers have the skills needed to work with them. Native apps have less latency, contain device libraries, offer device management capabilities and easily give the user access to device features from within an application.
The use of mobile applications for business purposes is, however, on the rise. Increasingly mobile workforces, the availability of business applications in hybrid or private “clouds”, and the ongoing integration of enterprise-level business systems are driving the need.
The choice of whether to develop your business app natively for each device or in HTML5/JavaScript is not one to take lightly.
Native apps are labor-intensive and costly. Multiple versions of the application have to be developed, coded, tested and deployed. They work a bit faster because more of the code runs from the device libraries rather than over the Internet.
From healthcare to manufacturing and logistics, mobile application use is on the rise. How you do it affects your return on that investment, so it is a critical decision. Choose wisely, and give your competition a “run for its money”.
From manufacturing to healthcare, just about every business that involves any sort of process is looking to automate all or part of its operation. Manufacturing has been moving this way for over twenty years, but business process automation is still in its infancy as companies seek to automate, monitor and improve overall process performance.
New tools for automating routine medical data entry have begun to arrive, touting savings in cost and time and the alleviation of data inaccuracies. For example, an off-the-shelf product recently announced by Oleen Pinnacle and Network Automation will do everything from “creating and loading claims test scripts to adding or updating provider records, benefit plans, member information and contract agreement rates,” according to the press release.
More often, however, businesses are attempting to design and then implement automation on their own, knowing that process automation has real bottom-line impacts. They want to make their business processes less labor-intensive and more streamlined to keep themselves competitive in a technology-driven economy.
There are at least a couple of steps you need to consider before embarking on process automation in your business.
In observing the roll-out of an e-Litigation system for Singapore’s Supreme Court, technology writer Imelda Tan points out that you should “reduce or simplify processes before you automate them”. She also cautions that often, in order to save time, companies will simply try to port old processes and habits to some form of automation even though some of those processes may be outdated.
“Reviewing processes continuously may incur more man-hours but can potentially save more man-hours in the longer term.” Tan writes, “Ask yourself if a process can be cut? Some processes are still around for legacy reasons. If no one in the company knows the rationale for their existence, eliminate them temporarily and assess the impact. If no one notices, then they may actually be obsolete.”
In a similar way, Alexander Mehlhorn, CEO of Framework One, recommends that you “merge your business analysis with systems thinking” to gain the best automation results, and that you hire an expert with sound business process analysis skills. “Finding a business analyst that understands your particular business requirements while also having strong systems thinking abilities is a rare, often expensive skill, and the reserve of larger companies. Smaller organizations will do well to team up with an experienced partner that can give them the benefits of an in-house resource without the high costs,” Mehlhorn explains.
This last aspect can be addressed by seeking experts from offshore companies that have the needed expertise and offer it at a lower cost than onshore process analysis teams. Development teams from companies that have diversified their offerings, technologies served and business process understanding are often a lower-cost way to create the strategy, help eliminate unnecessary processes and then move the remaining processes to an automated platform.
If the U.S. and global economies are to recover fully, process automation will be a critical part of that recovery. Companies seeking to automate more of their business processes should heed Mehlhorn’s advice and seek the necessary expertise before spending more time, energy and money than they need to.
According to Enterprise Architect Scott Robinson, in a recent TechTarget article, the library model included in the SharePoint 2013 platform “…frees users from dependence on IT while urging them toward true collaboration.”
He believes this is an essential change, one that will, if properly implemented, make the SharePoint experience not only better but more likely to encourage adoption and use. Yet he warns you not to migrate your existing folder hierarchy. In fact, he advises you to keep the libraries as flat and open as possible.
He explains how the legacy folder hierarchies now mess with SP’s structural integrity. “Few things are more destructive to SharePoint’s structure. Why? First, a folder hierarchy encodes metadata (how the different folders relate to one another) in a logical arrangement that is static and often opaque. Second, nested folders wreak havoc on SharePoint’s efficiency vis-à-vis SQL Server. Furthermore, folders add length to the URLs specifying the ultimate residence of your data — those URLs conk out at around 260 characters.”
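As a rough sketch of the URL-length problem Robinson describes, a script like the following could flag documents whose nested-folder paths push their URLs past the limit. The site URL, the exact 260-character ceiling and the helper function are illustrative assumptions, not part of any SharePoint API:

```python
# Flag documents whose full SharePoint URL would exceed the roughly
# 260-character limit mentioned above. Site URL and paths are hypothetical.
SITE = "https://contoso.sharepoint.example/sites/finance"
URL_LIMIT = 260

def flag_long_urls(paths, site=SITE, limit=URL_LIMIT):
    """Return (path, url_length) pairs whose full URL exceeds the limit.

    `paths` are site-relative document paths such as 'a/b/c/report.xlsx';
    deep folder nesting is what pushes URLs past the ceiling.
    """
    flagged = []
    for path in paths:
        url = f"{site}/{path}"
        if len(url) > limit:
            flagged.append((path, len(url)))
    return flagged
```

Running a check like this before a migration gives you a concrete list of documents that a deep folder hierarchy would break.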
SharePoint 2013 has a number of features and enhancements built in, but for all of them to work, everything needs to be tagged correctly and consistently. This requires a great deal of institutional discipline, and internal training is often needed.
The initial migration, if you are upgrading from a previous version of SharePoint, demands a great deal of expertise and care during data transfer and database implementation. Often internal personnel are not “up to speed” enough to both migrate the data and ensure that it is set up and tagged properly.
Outsourcing a portion or all of a SharePoint upgrade and initial setup is an option that may save you time, money and many sleepless nights. A number of resources exist in the form of SharePoint experts and database/data warehouse developers who will get the job done at a cost often lower than doing the work “in-house”.
Regardless of how you deploy SharePoint 2013, be sure to heed Mr. Robinson’s advice. Sticking to the new SharePoint protocols and leaving your old folder setup behind will be your best bet for a smoother and more productive migration and upgrade experience.
And the winner is…hybrid.
Well, not completely, not yet, but in many circles hybrid continues to be one way that larger companies are balancing the ease of entry of a public cloud against the greater security and control of a private cloud. For these companies, the hybrid approach gets them in while keeping at least a portion of their most critical data and applications under the direct control of the business. However, hybrid is not the way many early cloud adopters went; they preferred to build their own clouds and openly argued against the public approach.
Don’t think that the public cloud providers take the criticism lightly; they point to how loose many IT definitions of a cloud are. In fact, Amazon Web Services CTO Werner Vogels refers to private clouds as “false clouds” designed to get enterprises to buy more hardware.
In a recent TechTarget article, Modern Infrastructure Editor in Chief Alex Barrett observes “Part of private cloud’s problem could be IT’s loosey-goosey interpretation about what it is — and therefore what it brings to the table. Experts define a private cloud as dedicated resources running behind the firewall that are organized into an Infrastructure as a Service (IaaS) cloud computing platform. The National Institute of Standards and Technology (NIST), in turn, says that an IaaS cloud must include five essential characteristics to be a true cloud: on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service.”
Yet those who have deployed private clouds insist they would have it no other way. In their opinion, the fact that private cloud deployments lag behind public ones isn’t necessarily that important. Public clouds serve an important purpose, and for some they are a perfect solution, whether for quickly getting a startup off the ground or for expanding IT capabilities without huge CapEx investments.
Yet the private cloud makes sense for those who need cloud services but want tighter control and better security, or who simply don’t need to serve more than a small footprint. It makes sense to consider your business needs and the geography of your end users before deciding to deploy a global cloud when you only have a regional footprint.
The use of open source private cloud technologies is growing; several exist, some more mature than others, and many are starting to be readily adopted for private cloud deployments. Some adopters, like marketing automation company HubSpot, have built an internal cloud infrastructure while still running a large percentage of their operation on AWS. Their plan is to eventually migrate all but the emerging or high-growth applications from AWS to their private cloud.
There is at least one other very critical issue with the biggest clouds that may increasingly move businesses to consider only hybrid or private: privacy. Concerns have grown, particularly as a result of the federal anti-terrorist program PRISM and worries that AWS’s global footprint exposes personal data to cyber threats. AWS’s size alone makes it a very tempting target for everything from cyber terrorism to simple hacking attempts. Some enterprises view private clouds as simply safer because they can be deployed in facilities and on networks closer to home. These smaller deployments also make smaller, less attractive targets.
The debate is likely to continue, but if your company is considering any of the current cloud options, you should consider using outside experts to help match your business, its needs and its geographic requirements to the right cloud computing solution. In many cases these can be offshore firms that have both the expertise and the implementation personnel available, at rates hard to find in the U.S. and other developed nations. These teams have been cultivated to provide low-cost, high-value technology development services, with well-educated people available at an amazingly low cost.
You may not be ready to deploy your cloud yet, but most companies in the future will be using some form of it as the Internet increasingly becomes one big programmable operating system.
It seems that business is starting to grow for the logistics sector, thanks to the increasing movement of manufacturing sites from Asia to North America and Mexico. Part of what is making the move successful, according to Fleet Owner contributor Sean Kilcarr, is “the rise of simple and less expensive cloud-like computing options for managing logistic operations.”
His interview with Tompkins International VP Valerie Bonebrake revealed that the firm’s analysts are seeing growth in cross-border trade not only with Mexico but with Canada as well. With major retailers like Target and Nordstrom making moves into these regions, there is increasing need for third-party logistics (3PL) companies to expand into these markets as well.
Easing the load of tracking and managing these now far-flung and more heavily traveled routes is the availability of cloud-based transportation management systems (TMS). Bonebrake noted, “This capability is now affordable for mid-sized companies [and] while some may choose to adopt the technology, others see more value in pooling their freight to leverage greater buying power.”
Although there are several types of transportation management systems, they are only now beginning to gain a foothold, and all will admit that no two logistics companies are alike. This usually means that a certain amount of customization is required to meet specific business needs, and many times the TMS must be integrated with other systems like warehouse management or order processing.
In addition, off-the-shelf TMS software can be costly, and the less expensive systems usually limit the amount of customization that can easily be done, precisely to keep them cheap.
In a guest commentary for Logistics Viewpoints Brian Armieri, Chief Technology Officer at MercuryGate, sums up the space and the delays to adoption, “All software companies want to build products that are feature-rich, defect-free, and easy to maintain. Meanwhile, software users demand a rapid ROI, low-cost implementations, and customizable solutions. Users of transportation management systems are no different. They’ve long been searching for the ability to customize software to work the way they want it to work and look the way they want it to look. Traditionally, customization came at a steep cost. Some modern web-based systems have recently offered the lure of reduced operational cost but sacrificed on the ability to customize.”
The types of customization vary from firm to firm, but they usually involve front ends, the types of data fields and how they are validated, which business processes can or will be enabled by the software, and how the TMS talks to or is integrated with other systems.
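To make the data-field side of that customization concrete, here is a minimal sketch of the kind of validation rules a TMS customization might add. The field names, formats and ranges are hypothetical examples, not any particular product’s schema:

```python
# Hypothetical shipment-record validation of the sort a customized TMS
# might enforce before accepting data from another system.
import re

SHIPMENT_RULES = {
    "pro_number": lambda v: isinstance(v, str) and bool(re.fullmatch(r"\d{9}", v)),
    "weight_lbs": lambda v: isinstance(v, (int, float)) and 0 < v <= 45000,
    "origin_zip": lambda v: isinstance(v, str) and bool(re.fullmatch(r"\d{5}", v)),
    "dest_zip":   lambda v: isinstance(v, str) and bool(re.fullmatch(r"\d{5}", v)),
}

def validate_shipment(record):
    """Return the names of fields that are missing or fail their rule."""
    errors = []
    for field, rule in SHIPMENT_RULES.items():
        value = record.get(field)
        if value is None or not rule(value):
            errors.append(field)
    return errors
```

A rule table like this is also the natural integration seam: the same checks can run whether the record arrives from a web front end, a warehouse system, or an order-processing feed.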
To fully realize the value of even an “off the shelf” solution, you may need to seek the lower-cost development services and expertise of an offshore development company. Many have been working with the transportation industry for years, and some of the “home grown” systems in use today were actually “grown” by development teams in countries with lower overall rates for such services.
Even customization of existing systems and software can be done very cost-effectively, helping companies rapidly scale up to meet these emerging opportunities in North American logistics.
To quote Wikipedia: “the Internet of Things (IoT) refers to uniquely identifiable objects and their virtual representations in an Internet-like structure.”
Reading the entire entry might make fans of “The Terminator” nervous when they realize the concept: virtually every object, machine and device, from pacemakers to jumbo jets, directly linked to the Internet. It’s not quite a “machines take over the world” scenario yet, but it is the dawn of a new age for application development, and it extends the reach of “the cloud” into more aspects of both business and personal life.
A concept where all objects and people in daily life are equipped with identifiers was first proposed by Kevin Ashton in 1999. He envisioned that once they were “connected” they could be managed and inventoried by computers. All of it was predicated on the availability of radio frequency identification (RFID), since connecting everything would, for the most part, require a wireless connection and a way to identify each object.
According to Helen Duce, a director of the RFID Technology Auto-ID European Centre at the University of Cambridge, the university has created a bold vision of a new RFID-connected world: “We have a clear vision – to create a world where every object – from jumbo jets to sewing needles – is linked to the Internet. Compelling as this vision is, it is only achievable if this system is adopted by everyone everywhere. Success will be nothing less than global adoption.”
However, if you ask some in the IT universe, the thought of adding IoT to their plate, along with cloud, mobile applications and mining all that “big data”, is one they are not sure they are ready to handle…let alone afford.
In a recent Wired article, Mahesh Kumar, CMO of the “Data as a Service” company BDNA, expresses concern that IoT might be the very thing that sinks IT. He does acknowledge that IT faces a data problem, one that has been building for years. “The shift from mainframe to client/server,” Kumar points out, “drove a huge escalation in IT data. Storage networking introduced new layers of data to manage. Server virtualization not only increased the number of virtual servers to be managed, but also added even more data to map physical assets to virtual ones. Today, dynamic applications running on private or hybrid cloud infrastructure fluctuate in real-time, adding further complexity to IT data.”
He goes on to suggest that it is not just volume that is a problem but complexity; although he doesn’t say it outright in his article, the mix of structured and unstructured data in many data sets is what really makes managing all that data time- and labor-intensive. On top of that, add “big data” and the drive to dig into it and glean meaningful, profitable outcomes for businesses.
So does that mean IT isn’t ready for IoT? Sorry, but no. IoT is already off the drawing board and has already resulted in the introduction of products to enable the “Internet of Things”.
First there is the introduction of products that harness “ambient backscatter”, which, loosely defined, is a way for the “objects” targeted by IoT to hitchhike on virtually any available signal in their proximity. These devices can also draw power from those signals, which helps mitigate the hefty power consumption requirements of wireless devices.
“Ambient backscatter” is a networking method developed by researchers at the University of Washington. In short, the technology allows devices to communicate with one another wirelessly and with no batteries. Instead of creating their own signals, ambient backscatter devices essentially freeload off existing signals from radio, TV, cellular, and Wi-Fi networks, which invisibly blanket much of the earth. It sounds a little crooked, but it frees these devices from reliance on any fixed network access point and removes the battery-life problem by removing the battery.
Intel has recently introduced a gateway that it refers to as a “middleman” for IoT, and IBM and Semtech have also just released a wireless solution aimed at the Internet of Things. These solutions each target one or more of the basic needs of IoT but don’t necessarily mean it’s here…yet.
Nonetheless, several existing technologies already support using the Internet and specific applications to do very useful things for a variety of companies. Logistics departments are already deploying GPS and mobile applications, and more than a sensor or two has been put in place by businesses that need to remotely monitor any number of critical assets.
Developing such applications doesn’t need to be expensive and they can be done utilizing offshore development teams already skilled in programming the interactions of remote devices and computing infrastructures. So even if there isn’t an “ambient backscatter” device available, your business can start to use remote technologies to monitor everything from the freezer in your grocery store to a critical pressure setting on an oil pipeline.
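A minimal sketch of that kind of remote monitoring, with hypothetical sensor names and thresholds standing in for whatever your devices actually report, might look like this:

```python
# Poll remote sensor readings and raise an alert when one falls outside
# its acceptable range. Sensor names and thresholds are hypothetical.
THRESHOLDS = {
    "freezer_temp_f": (-10, 10),   # grocery-store freezer, degrees F
    "pipeline_psi": (400, 900),    # pipeline pressure, psi
}

def check_readings(readings, thresholds=THRESHOLDS):
    """Return alert strings for readings outside their configured range.

    `readings` maps sensor name -> latest value, as might arrive from
    remote devices over HTTP, MQTT, or a cellular link.
    """
    alerts = []
    for sensor, value in readings.items():
        lo, hi = thresholds.get(sensor, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            alerts.append(f"{sensor}={value} outside [{lo}, {hi}]")
    return alerts
```

The alerting logic is trivial; the real development work, the kind often outsourced, is in the device integration, transport and dashboards wrapped around a loop like this.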
Probably not the end of IT but only the beginning of yet another revolution in computing.
Netskope, a cloud application analysis and policy development firm, recently released analysis from its “Cloud Confidence Index” showing which industry sectors have sufficient “confidence” in the usefulness, security and/or performance of cloud services and applications.
Looking at it from an industry-by-industry perspective, they found Enterprise Resource Planning (ERP), along with Document Management and Security, to be the services with the highest level of confidence among enterprises seeking to adopt cloud services. This could explain why more and more companies are either buying into existing cloud ERP providers or building their own private versions.
Nonetheless, many of these same organizations have already moved their CRM to the cloud, and now they are starting to think about taking their ERP systems out from behind the “firewall” and integrating them with their CRM systems in a public, private or hybrid cloud.
Many companies are utilizing systems built on older technologies that were never quite as innovative as they could have been, because they were developed mainly to fill the gaps left by the other systems they were using from Oracle, SAP and IBM. As such, they were often built on the backs of even older, more outdated tech.
The inflexibility of most of these systems is why customers want to move to the cloud. Unfortunately, the current enterprise resource planning providers are too entrenched in their old systems and afraid to “cannibalize” their base in order to embrace the cloud. Many ERP start-ups are taking advantage of this condition and are creating new levels of disruption in the ERP marketplace.
It is the SMB market that would greatly benefit from moving and integrating ERP, CRM and business intelligence management to the cloud. The cost of entry is starting to lower and dozens of businesses exist that will aid these smaller companies in the integration and movement of these critical business tools to the cloud. By utilizing the outsourced services of a development firm skilled at planning, executing and supporting the integration of services as well as moving them to the cloud, the cost of the move can be greatly reduced.
The big winners in the future will be the companies that can maximize their data across multiple disciplines to meet current and future challenges and opportunities. The cloud holds much promise, but be sure you have the “best of the best” in expertise aiding you. There are many choices now, but not all will be right, and the new systems can be complex to understand, both technically and in how they are priced.
Make the move, but be sure you have the right solution before you do.
Even after years of web-based travel and booking sites, the travel industry continues to see technology as key to its continued success. From large travel agencies to hotel chains to whole countries and destinations, web and mobile applications and systems are being developed to help further drive travel around the globe.
U.S. companies could learn a bit from European and Asian firms, whose customers have rapidly increased their use of mobile access to travel websites and applications. With mobile data deployments in some countries growing faster than in the U.S., this is not just a trend; it is the best way to engage both potential and existing customers with services and information.
As Dr. Madanmohan Rao stated in a preview of his conference presentation for APTHAT in November, concerning what he refers to as “tweet tourism”: “The convergence between mobile media and social media — thanks to mobile Internet — is changing the way tourists and the tourism industry are accessing, publishing and sharing information and connections.”
The Amadeus Group, a provider of technology products and services to the travel and hospitality industry, recently issued a report on the key role tech will play in serving the next generation of travelers. Although older travelers are no strangers to technology and web travel booking services, the next generation will likely spend more time using mobile devices and cloud-based services, and will turn, out of habit, to social media for everything from advice to reviews to customer service issues.
Based on interviews with “adventurous and exploration oriented travelers” between the ages of 18 and 30 across a number of countries, the report cites three key themes that emerged. The first is that these travelers, and those who will follow, are digital natives and Internet experts when it comes to useful resources. They will expect 24/7 access, will do most of their trip shopping online, and will expect all of these resources to move seamlessly from smartphone to tablet to PC.
The report also cited a shift among next-gen travelers toward being “explorers” rather than “tourists”. This will require tools that make their exploration of the destinations they choose device- and application-centric. Add the fact that “social” has created a generation for which socializing online is “second nature”, and the desire to “share” the travel experience means both travel companies and destinations will need to accommodate sharing across multiple social platforms.
It’s a new world.
U.S. companies will need to stay ahead of the technology curve if they are to take advantage of both current and next generation travelers. Whether a small travel agency or a large airline, how well you develop mobile applications and web resources will define your success in the future.
The problem in many cases is the cost of developing these resources. For some U.S. companies, and companies across the Americas, seeking the lower cost skill and expertise of offshore development companies may be the solution for keeping up. These are the same companies bringing innovation and new product development to enterprises all over the globe at hourly rates that are just not possible in the U.S. Many of these firms have established U.S. based operations and utilize local and regionally based personnel to manage projects for U.S. customers.
Working with these types of development companies requires superior communication and project management skills, and the U.S.-based operations help maintain the necessary interaction between customer and supplier, keeping projects on track and on budget. Their offshore counterparts provide technical and programming expertise that is more affordable than similar services from U.S.-based development teams. The combination gives even small businesses assets they couldn’t otherwise afford and ensures a higher return on your project investments.
Any way you slice it, the travel and hospitality industry is on a tech binge and travelers are going to benefit. How your company fits into this new model may make the difference between success and failure.
It has been many months since Salesforce.com acquired leading email service provider turned cloud innovator ExactTarget, but the integration of the two clouds is already becoming a reality.
Marketing departments, the primary users of the email service and the initial adopters of its marketing automation and analysis engines, are already giving the union a thumbs up. Many have been using Salesforce and its cloud while also managing a separate use of ExactTarget. Salesforce.com has never offered a true mass email service, limiting users to 1,000 emails a day from its platform, but true integration of these two services should bring much-needed new capabilities to Salesforce.
Although Salesforce is often the domain of sales with marketing support, the two groups are becoming more integrated with each other as lead generation, nurturing and sales cultivation of opportunities become an increasingly synergistic effort.
Scott Dorsey, CEO of Salesforce.com company ExactTarget and Salesforce Marketing Cloud, announced the debut of the ExactTarget Marketing Cloud, a marketing platform that combines ExactTarget’s email, mobile, web, marketing automation, data and analysis, and content capabilities with Salesforce’s social media offerings: Buddy Media, Radian6, and Social.com. The announcement occurred during Dorsey’s keynote at ExactTarget’s Connections ’13 conference in Indianapolis, Indiana.
“As the world becomes more connected, marketers have an unprecedented opportunity to lead transformation in business by connecting their companies to their customers in entirely new ways,” said Dorsey.
Salesforce, in making its $2.5 billion purchase of ExactTarget, enlarged its social marketing solutions with the new acquisition’s marketing automation and campaign management tools. The roll-out of new email marketing capabilities strengthens one of marketers’ most dependable — and relatively controllable — tools. The use of predictive analysis, based on customer behavior and preferences, is based on technology that ExactTarget acquired when it bought iGoDigital in 2012.
With these new additions, Salesforce becomes both more powerful and more complex than ever. It also opens up new possibilities for applications tailored specifically to your business needs. Engaging outsourced experts skilled in developing applications for the Salesforce cloud can provide you with new twists specifically designed to benefit your company’s use of the cloud giant’s service. Many offshore development teams have been working with Force.com and ExactTarget for years, providing a lower-cost avenue to tailor these new services to your needs.
All in all, the addition of ExactTarget to the Salesforce.com set of assets shows clearly that they are marching full speed ahead to build the most complete set of CRM tools available in the cloud.
According to Automattic founder and president, Matt Mullenweg, WordPress should be powering the majority of all websites. Tiger Global Management thinks so too. It’s only been months since they paid $50 million for a secondary stake in the WordPress.com maker Automattic, and they just threw in another $60 million in shares purchased from Polaris Partners. The investment management firm clearly has a similar goal in mind for WordPress and Mullenweg is determined to grow Automattic and grow it fast.
“Their [Tiger’s] deep resources, market experience, and long-term outlook make them an ideal partner for the next phase of Automattic and the continued growth of the WordPress ecosystem,” Mullenweg wrote recently in a blog post. “What we’re building will take time and it won’t be easy, but things worth doing seldom are.”
Mobile factors heavily in the long-term plans of WordPress, just as it does for Joomla! and Drupal, but of all the content management systems (145 and counting), the big WP holds 58.3% of the CMS market, trailed distantly by Joomla! (9.5%) and Drupal (5.8%). Everyone else falls well below, and increasingly it looks like WordPress will keep growing just as the development communities that support it have grown.
That is a tall order, but given that over 65% of all websites use either a “home grown” CMS or no CMS at all, there seems to be room for several top platforms to grow. However, despite how powerful WordPress is, many businesses find that meeting their needs requires functions or capabilities better served by other systems. The most difficult part for organizations deciding on a platform is sorting through the options with a clear understanding of how to match the right platform to their business needs.
Even after you deploy your system of choice, the popularity of the top content management systems makes them a target hard for the hacking community to resist; the top two are also the ones with the greatest security risks. Help Net Security reported a recently discovered backdoor with brute-forcing capabilities that searches the Internet for WordPress and Joomla! websites with weak administrative passwords in order to turn them into malware and phishing sites. The sheer number of available targets leads hackers to favor attacking the top couple of content management systems over those with fewer deployments.
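One common defense against the brute-forcing that backdoor relies on is locking an account after repeated failed logins. Stock WordPress does not do this out of the box (it is typically added via a plugin), so the sketch below is a generic illustration of the idea rather than WordPress code:

```python
# Generic login throttle: lock an account after too many failed attempts,
# defeating simple brute-force password guessing.
from collections import defaultdict

class LoginThrottle:
    def __init__(self, max_attempts=5):
        self.max_attempts = max_attempts
        self.failures = defaultdict(int)   # username -> consecutive failures

    def attempt(self, username, password_ok):
        """Return True only if the account is not locked and the password is correct."""
        if self.failures[username] >= self.max_attempts:
            return False                   # locked out, even with the right password
        if password_ok:
            self.failures[username] = 0    # reset the counter on success
            return True
        self.failures[username] += 1
        return False
```

A real deployment would persist the counters, expire locks after a cooldown, and throttle by IP address as well, but the principle is the same: weak passwords survive only when attackers get unlimited guesses.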
When it comes time to build your next website, or upgrade your old one, it pays to employ development and management teams well versed in both the capabilities of and the issues facing these new engines of web commerce.
Outsourcing the development and/or management of systems like WordPress to well-established offshore firms can often provide the knowledge needed to get the most out of your website while keeping it secure. They can offer services at far lower rates than U.S.-based development companies and employ developers who have benefited from an intense, technology-focused education. Several of these firms have established U.S.-based project management teams that keep communications clear while still taking advantage of lower offshore development prices.
The deep experience these firms bring in functionality and security offers your organization a way to sort through the options, make sound business decisions about what to deploy, and provide cost-effective management of a highly secure website.
But if you wish to choose yourself, choose wisely. With 145 choices (see them here), an outside expert might just save you a sleepless night or two.
HANA is SAP’s flagship product, integrating database, data processing and application platform capabilities in-memory. Oracle’s Larry Ellison was hinting at next week’s announcement of an in-memory option for the Oracle database, a direct response to the success that SAP has had with HANA.
According to SAP, HANA has been its fastest growing, most successful product ever and Oracle is not sitting idly by as it brings an almost identical feature to its Oracle Database product. As Ellison puts it, “Virtually every existing application that runs on top of the Oracle database will run dramatically faster by simply turning on the new in-memory feature. Our customers don’t have to make any changes to their applications whatsoever; they simply flip on the in-memory switch, and the Oracle database immediately starts scanning data at a rate of billions or tens of billions of rows per second.”
SAP isn’t sitting idle either. SAP and Hortonworks recently entered into a resale agreement allowing SAP to resell the Hortonworks Data Platform (HDP). According to SAP, this allows it to offer users a complete enterprise architecture that leverages what it claims is the industry’s first 100% open-source data platform powered by Apache Hadoop. Hadoop is a powerful open-source framework that helps uncover business insights from huge amounts of structured and unstructured data quickly and at a lower cost than companies may be experiencing today.
SAP is not dull on the marketing front either: it has joined forces with communications giant AT&T, which will broadly offer SAP’s suite of mobile app creation, security and mobile management software to its customers.
SAP has far to go to really threaten Oracle’s customer base but they are making progress.
This may be why, on September 19th of this year, Gartner recognized SAP as a “Leader in Industry Analyst Firm’s 2013 Single-Instance ERP Magic Quadrant for Product-Centric Mid-market Companies” for the third year in a row. SAP’s “Business All-In-One”, which runs on SAP HANA as well as SAP’s Sybase Adaptive Server Enterprise, is deployed both on premise and through a private cloud offering.
If your company has decided to implement SAP or already uses it, the growing number of application opportunities, better business intelligence through data, and the need to meet increasingly critical business demands may make it wise to consult with experts on how best to incorporate these tools into your business. A number of offshore firms can provide you with lower-cost expertise as well as strong track records in implementing and developing for both SAP and Oracle.
By leveraging these offshore experts, and adapting these platforms to meet current and future business challenges, you will get the most out of your investment in these powerful platforms.
After a decades-long career as a pioneer in software development, Dan Bricklin isn’t done yet. Now the CTO of Alpha Software Corporation, Bricklin is providing leadership in the development space. He is also advocating the use of Java for mobile development over native code.
In a recent article for InfoWorld, Bricklin goes to great lengths to explain why, despite the fact that certain native code applications may run faster than ones created in Java, there are many more cases where Java simply runs circles around native code.
Speed is “Top of Mind” for Bricklin.
Although he champions Java for mobile, the InfoWorld piece goes into far more detail than just that, showing that there will be a balance among native code, Java, advances in browser functionality, and the use of HTML5 and CSS3.
Speeding things up is on the top of Dan’s mind and he sees both old ways and new to approach the problem, “It’s long been known in programming that different languages for the same algorithm can result in small performance gains of perhaps five to 10 times, but that improved algorithms give you orders of magnitude improvement in the range of 10 to 100. That is, if you can approach a problem in a different way or rethink it after you’ve coded it and have seen what is really needed, you can often get enormous performance gains versus just running the same operations somewhat faster. The history of algorithms for sorting is a simple, powerful example.”
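Bricklin’s point about algorithms versus languages can be illustrated with a small, hypothetical Python example (not from the article): the same membership-counting task done first with repeated linear scans, then rethought with a hash set. The answer is identical, but the amount of work is dramatically different.

```python
# Hypothetical illustration of "better algorithm beats faster language":
# count how many of a batch of lookups appear in a dataset.
import time

data = list(range(5_000))
lookups = list(range(0, 10_000, 2))

# Approach 1: repeated linear scans -- O(n) work per lookup.
start = time.perf_counter()
hits_scan = sum(1 for x in lookups if x in data)
scan_time = time.perf_counter() - start

# Approach 2: rethink the problem -- build a hash set once, then O(1) lookups.
start = time.perf_counter()
data_set = set(data)
hits_set = sum(1 for x in lookups if x in data_set)
set_time = time.perf_counter() - start

assert hits_scan == hits_set  # same answer, far less work in approach 2
print(f"linear scans: {scan_time:.4f}s   hash set: {set_time:.4f}s")
```

Switching languages might shave a constant factor off approach 1; switching to approach 2 changes the growth rate itself, which is exactly the orders-of-magnitude gain Bricklin describes.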
Whether your organization is being driven to provide more mobile applications for your workforce or needs legacy software accessible via a mobile application, deciding what to develop in is your most fundamental decision. Many companies with legacy systems, or lacking the internal expertise of mobile-savvy developers, might benefit from employing an outsourced solution.
Outsourcing this sort of development to third-party teams that provide well-established experts in native code and Java, and that understand the advances in browsers and mobile app environments, is an important option to consider. This can speed development time, lower costs and conserve internal resources for more critical projects. The rapid advances in HTML5 and the continuing evolution of CSS3 make keeping up difficult for more narrowly focused development teams. Seeking outsourced experts in these areas, and for post-development testing services, is also a good way to ensure your choices have been good ones.
As U.S. healthcare organizations begin to incorporate the new business model that the “Affordable Care Act” creates for them, the idea of delivering health services and information by way of mobile applications is gaining momentum. Physicians are already signaling they would welcome mobile Electronic Health Record (EHR) access.
In a recent survey of physicians by Black Book Rankings, 83% of the doctors polled said they would welcome mobile access to these records should it become available. Manhattan Research’s survey found that 72% of physicians now use Apple iPads in their everyday practice, which clearly signals that healthcare professionals will increasingly depend on mobile apps to deliver their services.
Mobile EHR apps are difficult to build securely outside of iOS for now, but they are only the tip of the iceberg of the mobile application boom about to hit healthcare. According to Motley Fool technical writer Leo Sun, “Unlike mobile apps from retailers, EHR companies cannot simply repackage a website as an app and release it. Mobile EHR apps must be built from the ground up since tablets are not optimized to include all the powerful features that the desktop version has. Tablets have smaller displays and less processing power, rely on slower wireless connections, and require optimized graphical user interface designs for touch screens.”
He goes on to note, “The iPad’s greatest advantage over Android competitors is that each generation has identical hardware and software, meaning that apps can be easily developed and tested for the platform. By comparison, the Android tablet market is fragmented, with multiple vendors creating a plethora of hardware combinations — making it difficult for app developers to create a single app that works flawlessly across all platforms and configurations.” As such most development has been to create a “native app” or one designed specifically for the iPad’s OS.
You can find a few out there already that have built an audience. WebMD, Heart Rate, Glucose Buddy, and Find Me Gluten Free are just some apps that are already on app store shelves. These are designed to do everything from helping to diagnose a symptom you are experiencing to actually using the phone’s camera to calculate your heart rate.
However, more large hospital systems are looking to push proactive health tools out to their patients, applications that allow for remote diagnosis, and even apps that let mobile devices provide remote patient monitoring for outpatient care.
“Patients sharing data about how they feel, the type of treatments they’re using, and how well they’re working is a new behavior,” says Malay Gandhi, chief strategy officer of Rock Health, a San Francisco incubator for health-care start-ups. “If you can get consumers to engage in their health for 15 to 30 minutes a day, there’s the largest opportunity in digital health care.”
Smaller practices will need to join in as well; even small practices can benefit from mobile solutions, yet they often have fewer resources for developing specialized mobile applications. For these organizations, looking to outsourced development services may be the answer to keeping their practices competitive with larger health organizations.
Application development of patient and doctor tools, even integration with existing practice computing systems, can be accomplished by an experienced software development team. Many will benefit from utilizing offshore teams in countries where the hourly cost of development is significantly lower yet the level of expertise is in some cases higher.
Regardless of how various healthcare organizations take up the challenge of using technology as a tool to lower patient costs, mobile applications will be a part of it. Providers of healthcare have the lead now and the pressure is on for them to provide better services for less. Mobile apps will be fundamental to this effort and something your practice might need to start using sooner rather than later.
Marketwired reported in late August that MoTwin, a mobile platform provider for business applications, had outlined what it considers the key problems enterprises face in crossing the “mobile cliff”. The mobile cliff is a metaphor for the difficulty of integrating back-end/cloud infrastructure while providing real-time performance and the best possible mobile user experience. While primarily promoting MoTwin’s application platform and the white paper it commissioned from Peter Crocker, founder and principal of Smith’s Point Analytics, the article and the white paper do cite real, fundamental needs that must be met when extending formerly “on premise” business systems to mobile.
According to Mr. Crocker, “The mobile enterprise market has not yet stabilized on a platform or methodology, leading many companies to try multiple times to get a mobile application that satisfies both the user expectations and enterprise requirements.” He went on to say, “While there are many open source — free — framework and tools that address some of the front-end mobile design, the back end remains the critical piece of enterprise mobile today. At a cost of up to $500,000 per mobile implementation, not to mention the loss of business costs, a ‘mobile-right’ strategy, that includes back end integration, real-time updates and context-aware functionality, is required.”
Still, expert development teams will be necessary.
Crossing the “mobile cliff” may soon become critical for a number of industries, none more obvious than retail sales. In a recent ZDNet article technology sector observer Tim Lohman provides a few examples of retail sales use of mobile apps as a sales channel.
One retail application he points to is ticket sales, quoting Greg Fahy, Head of Product for Australian ticket retailer Ticketek, after the company completed a two-and-a-half-year mobile app push based on consumer demand: “Mobile is huge for us,” he said. “For our online traffic, just over 30 percent are on mobile devices and for some of our content — One Direction or Justin Bieber — 45 percent of our tickets would sell on mobile, so it is a really important area of growth for us in the coming years.”
Given the clear need to start considering and developing mobile retail applications for your business now, if you haven’t already, you may want to outsource their development. Considering the need for both mobile development and the integration of multiple back-office systems, going mobile can get expensive.
One way to avoid the high cost is to seek offshore outsourced development teams, which often work at a fraction of the cost of U.S.-based developers. U.S.-located project management teams are often involved in these offshore development projects to keep things flowing and provide clear communications between the offshore developers and clients. Many of these companies have been established specifically to augment existing internal capabilities or to provide short-term expertise that would be too expensive in a long-term hire.
Either way, bridging the mobile cliff is now an imperative for more and more retailers.
Oracle, in its continued effort to move from an on-premise software provider to one more in step with current technologies, recently released its Oracle BI Mobile App Designer. The app designer allows business developers to create interactive analysis applications for use with mobile devices.
The app authoring system features a straightforward, browser-based, drag-and-drop interface that allows developers to assemble data, text, media, visual charts, and tables from a multitude of sources and provide apps that are customized to their business needs. Companies that wish to put their Oracle applications into the hands of a very mobile workforce have the ability to do so now with the new design system.
Although Oracle touts the system as being as easy to use as any other business software some companies may wish to utilize outsourced developers. These development teams have experience in mobile app development on multiple systems and the experience necessary for developing applications for key Oracle products like WebLogic and for integrating Oracle mobile apps into other enterprise BI tools.
Both Oracle and Microsoft are in a race to bring their BI applications, systems and clouds into a vastly more mobile and data-driven business world. According to Michael Tejedor, Senior Product Marketing Manager at Microsoft, the firm is “super focused on giving customers the ability to connect to the broad array of data out there”.
Smaller companies will soon be able to take advantage of tools that until now only the largest firms could afford. By using lower-cost, outsourced development teams, these organizations can put powerful new tools to work and compete without dramatically raising internal staff costs or straining already stretched internal development teams.
Bring your own device (BYOD) has skyrocketed in companies large and small, placing increasing pressure on internal IT staff to provide mobile access to the business tools employees use every day. However, security continues to be a concern that IT departments must consider when deploying applications and access to employee mobile devices.
Salesforce.com, arguably one of the largest and most developed of the CRM clouds, isn’t taking that concern lightly with the release of its mobile SDK. According to JavaWorld, that’s why the new SDK includes compatibility with Good’s containerization technology. According to Salesforce, the inclusion of this security technology will make its applications safer and more easily managed.
According to Good’s CEO Christy Wyatt, “With cyber security and privacy being top of mind for every CIO, customers today are looking for a secure mobility platform that protects their business data without impacting the end-user experience.” With the addition of the technology in the 2.0 release, she added, “Now, developers charged with building custom apps for Salesforce can leverage the same secure framework trusted by the world’s leading banks, federal agencies and regulated industries.”
Still, companies may seek to employ experts in both Salesforce app development and in exploiting Good’s security technology when it comes time to build or deploy newly built mobile applications. Many outsourced development firms have years of deep experience in both the Salesforce platform and mobile security. Offshore development firms can often offset internal experience shortfalls and provide an affordable path to the deployment of critical mobile applications for your business.
One thing is for sure, taking Salesforce on the road is now more secure and signals once again that mobility and the cloud are here to stay.
Only a couple of months ago Oracle’s Larry Ellison was proclaiming that the “Oracle Public Cloud” would be “the most comprehensive cloud on the planet Earth.” He was explaining the seven-year-long process Oracle had undertaken to rebuild all of its applications for use in the cloud. At that time, early June of this year, he was building on last October’s announcement of the software giant’s launch of Oracle’s cloud, which contained the company’s Fusion Applications offered as software as a service (SaaS).
Ellison has long been a critic of the “cloud” because, in his opinion, it was mostly hype around existing technologies and concepts, some of which he originated years ago, brought to life by improved technologies and more widely accessible broadband networks. Yet when he made these statements in June he failed to mention that Oracle was negotiating with Microsoft to have these applications certified to run on the Azure cloud.
Amazon Web Services already resells Oracle technologies on its cloud platform, and according to James Staten, a principal analyst at Forrester Research, the Oracle cloud is weak and getting weaker. Although Oracle isn’t giving up its cloud just yet, it clearly recognizes the need to get its software into the cloud, one way or another.
Oracle apps run better on the Oracle Cloud?
Regardless of where they end up, Oracle can still lay claim that its applications will work better in an Oracle cloud. But companies wishing to run Oracle applications within a different cloud platform (AWS, Azure, Rackspace, etc.) might do well to employ the services of a company experienced in adapting applications to a variety of cloud platforms.
Outsourcing this level of expertise to application and software development teams can be accomplished via both onshore and offshore companies, but the cost of “upgrading” your development by outsourcing parts of it can be very low when using offshore assets. Ellison’s drive to get Oracle products into the cloud can be an opportunity for companies provided they have the right expertise to guide them.
It’s clear by now that there is a hungry audience for applications developed for use on mobile devices. That hunger now extends well into business, and mobile-savvy employees are clamoring for apps that extend to business use. Apps and mobile accessories are already taking credit cards regardless of where the transaction happens; business folks want similar freedom to access other forms of business transactions from their own mobile devices.
Mobile application development and the platforms that support them are maturing and enterprises of all sizes and industries are putting strain on IT departments everywhere. As Charlotte Dunlap, an analyst at Current Analysis, Inc., puts it “Backend integration is the biggest headache for enterprise developers. They are under tremendous pressure to provide the workforce with these mobile apps right now and they have to unlock those legacy systems.”
Legacy systems are not always easy to integrate, and most efforts depend on APIs to bridge the legacy system and provide its functionality in a mobile app. As Michael Facemire, a senior analyst at Forrester Research, a firm based in Cambridge, Mass., puts it, “The Holy Grail is mobilizing the enterprise and adding the ability to manage APIs on top of development tools”.
Amazon Web Services thinks so too.
They recently introduced a cross-platform notification service so that mobile apps can proactively keep their users aware of critical events and relevant information. Their “Mobile Push” offering, a managed push notification service, utilizes one simple API so that application developers can easily send notifications to Apple iOS, Google Android and Kindle Fire devices.
In yet another example of a major cloud provider pushing the development community to produce more innovative apps, Amazon is letting all AWS customers use Mobile Push for free at first, allowing them a million notifications a month at no cost. Above that, the charge is $1.00 per million push messages. “Many customers tell us they build and maintain their own mobile push services, even though they find this approach expensive, complex and error-prone,” said Raju Gulabani, Vice President of Database Services, AWS. “Amazon SNS with Mobile Push takes these concerns off the table with one simple cross-platform API, a flat low price and a free tier that means many customers won’t pay anything until their applications achieve scale.”
This is good for enterprises wishing to build useful applications for mobile business use but the pressure to do it all is still difficult for most IT departments. To meet both the need for attacking legacy system integration while also developing the critical applications your business is demanding you might do well to outsource some parts of both projects. Few businesses can afford to maintain the kind of expertise needed to rebuild old systems to meet new needs and keep a stable of application developers on hand too. Outsourcing the application development to well managed teams of third party developers can allow you to focus your resources on legacy integration or vice versa.
Either way, mobile apps are here to stay and their impact on business will only put more pressure on internal IT departments. Seeking outsourced solutions for “back-filling” your IT workforce might just be the way to successfully meet both needs.
Oracle’s new BI Mobile App Designer, recently unveiled by the business intelligence giant, runs in a browser and has a drag-and-drop design format. This allows end users to mash up graphs, tables and other types of data to, as Oracle puts it, “create mobile analytical apps tailor-made to their lines of business.”
The use of HTML5 allows the apps to run across iOS, Android and Windows-based devices, with users able to share apps via an App Library catalog. Part of the larger Business Foundation Suite, and also included in the BI Mobile option for the Enterprise Edition, it is being positioned as a self-service product.
As Forrester Research analyst Boris Evelson observes, “‘The self-service’ term has many interpretations, it’s not just yes or no, there are many shades of gray.” He cautioned that “self-service” has a number of concerns to consider.
He noted in a Forrester report on self-service BI released last year that “‘intuitive’ and ‘user friendly’ are subjective terms”: “A point-and-click and drag-and-drop graphical user interface (GUI) may be a nirvana of intuitiveness to an information management pro who started his computer career working with punch cards or green-screen terminals, but to a younger generation of knowledge workers brought up on search GUI from Google and social media GUI from Facebook, a point-and-click GUI may not be as obvious or natural.”
Paul Rodwick, Oracle’s Vice President of Product Management however states that “Purpose-built mobile analytical apps, created and used by business people every day, greatly expand the opportunities for companies to deploy analytics broadly to everyone, everywhere. The new Oracle BI mobile capability enables business users to deliver powerful analytic apps self-sufficiently, for use on any mobile device, to improve business outcomes and ensure everyone is effective while on the go.”
Businesses will still need help to take advantage of these new tools.
There are a number of organizations, both in the U.S. and offshore, that have years of experience developing in the Oracle and Microsoft business intelligence arena. The addition of these new tools simply adds to their arsenal of BI solutions and speeds their development cycles to your advantage. Companies that either lack the bandwidth or the in-house staff to focus on developing new applications may wish to seek the help of an offshore development team.
In many cases these outsourced additions to your “arsenal” will let you take full advantage of these advances, largely driven by the need to mine “Big Data”, saving your company dollars and conserving internal resources for more “mission critical” projects.
Further staking its claim as a major cloud provider Salesforce.com is, as Simon Bisson of CiteWorld put it, “encouraging an app renaissance” with the release of new design templates, API integrations and a set of sample applications. With around 50% of its 1.2 billion daily transactions coming through applications using its APIs, the SaaS pioneer is providing tooling to help developers build more compelling – and more user focused – applications.
During the release of these new development tools Salesforce’s Adam Seligman, VP of Developer Relations, pointed out that developers have to move beyond focusing on code to consider the design, delivering as he puts it “good looking apps that work with enterprise data.” He went on to note that “it’s very simple, very easy to use; and all of them [the template set] can be styled with CSS.”
With a heavy emphasis on providing more tools for mobile, the set includes Developer Mobile Templates: 22 CSS and HTML templates that help developers focus on common business scenarios. One part of this impressive release targets C# and .NET developers: the addition of the Xamarin Pack to support mobile development in those languages.
Based on the Mono framework, Salesforce.com’s new Xamarin Mobile Pack is designed to let C# and .NET developers build native iOS and Android apps and integrate them into the suite of Salesforce Platform Mobile Services. “We’re going to turn on millions of C# developers to start building iPad and iOS and Android Apps working with Salesforce data,” Seligman said.
Joining Xamarin in these new “mobile packs” are Knockout.js, Appery.io, and Sencha. These packs are intended to let developers build apps that connect to Salesforce business data using the tools and frameworks they already know, all helping to create new business applications that integrate properly into the Salesforce.com application layer and expand the power of an already powerful cloud.
Now businesses have even more reason to expand Salesforce to meet new challenges.
For some, taking advantage of these new tools will be difficult or even expensive using in-house or more locally based external development teams. However, utilizing an outsourced team that specializes in Salesforce development may be the answer for many companies. Utilizing lower-cost developers to create business-specific applications can and will save you both dollars and time.
One thing is for sure, Salesforce combined with strategic mobile business applications is one way companies can win in a technology driven marketplace.
Salesforce.com doesn’t just want you to buy apps…they want you to build them yourself.
Cloud pioneer Salesforce.com, perhaps one of the first companies to offer an “app store” for its sales and marketing services, looks at app development as a way both to strengthen customer loyalty and to continue adding value to the services it already provides.
The drive for companies to build and run their own private app stores for staff is one Salesforce is clearly encouraging, and for good reason. In a recent LinkedIn Influencer article Olaf Swantee, CEO of EE, which operates the T-Mobile and Orange mobile services in the UK, wrote “Fuelled by an increasingly mobile workforce, the rise of bring-your-own-device (BYOD), faster mobile connectivity through 4G, and cloud computing, IT departments are using the app store model as a way of providing access for employees to critical business applications from anywhere, any device and anytime.”
He also noted that “Many already use commercial enterprise app stores such as pioneer Salesforce.com’s AppExchange, which has more than 1,800 customer service and business apps available. Other enterprise software vendors, such as SAP, have also set up app stores for customers.”
He also pointed out that organizations are “increasingly building their own internal enterprise app stores as a way to manage corporate-sanctioned apps for employees to use on PCs and mobile devices.” Often many of these are developed specifically to provide additional functionality, mobility or data acquisition capabilities to Salesforce.com so that the time-honored solution fits better with their business needs.
Even other “clouds” are developing apps to integrate with Salesforce.com.
Just recently, cloud storage company Box developed an app for Salesforce’s AppExchange that delivers the full functionality of Box within the Salesforce CRM suite. According to a recent InfoWorld report, the application was built using the Box Embed HTML5 framework and provides Box collaboration features like editing, commenting and task management within a Salesforce record, according to Aaron Levie, Box’s CEO and co-founder.
However, application development, whether it is for Salesforce.com or for a mobile app that integrates with it, can be an expensive proposition. For some companies, an outsourced approach to application development may be the answer. By working with U.S.-based project managers and employing one of the many development talent pools available offshore, even companies with smaller budgets can have specialized, business-critical applications built at a fraction of the cost of in-house personnel.
Many of these companies have taken great pains to build organizations that are well educated and highly experienced at developing SF applications, along with many other development project types, and offer not only lower cost but in some cases superior results. Salesforce has proven itself as one of the primary leaders in the cloud and regardless of size your company owes itself a hard look at how it can leverage these new services to fuel growth and drive new revenue.
One thing is for sure…the cloud has started to make some strange bedfellows.
The cloud is really a metaphor for an intricate set of elements: operating systems, virtualization tools and, of course, data centers, high-speed networks and security software. Many pundits argue about just how significant the “cloud” is and what its future holds, but if the recent actions by Oracle are any indication, there is opportunity yet to be discovered.
Oracle must think so if, as recently reported in The Wall Street Journal, it is willing to actively seek out and partner with two of its former foes to ensure that their individual products work well together within, or as part of, a cloud. In the article James Staten, an analyst with Forrester Research, said the alliance should help Microsoft’s Azure compete better with rival cloud services, such as those offered by Amazon.com, and also help make Oracle’s offerings more attractive in comparison to so-called “open-source” alternatives that are widely used on the Web these days.
Oracle’s simultaneous embrace of Salesforce.com, and its desire to better integrate with it, is yet another signal that competition in cloud building requires cooperation between former rivals. The fact is that many in the know believe this is ultimately a good thing for cloud consumers.
Oracle’s John Foley, Director of Strategic Communications, thinks so and stated in an article for Forbes in July that, “We talk about the cloud as if it’s one big, holistic IT services layer that wraps around the world like, well, the atmosphere. In fact, the cloud is composed of hundreds of separate and distinct clouds and cloud services, and there’s still a long way to go in stitching all of that together. That’s why the agreements announced by four of the market’s leading cloud providers are so significant.”
Oracle’s recent changes to both WebLogic and Coherence were specifically designed to support their place in a cloud structure and to “provide an integrated foundation infrastructure for customers who want to build a cloud infrastructure, for customized applications or applications running under our Fusion middleware,” according to Mike Lehmann, Oracle vice president of product management.
Oracle now describes this set of software, which also includes the Tuxedo application server and recently released Oracle 12c database, as the Oracle Cloud Application Foundation.
This also strengthens the hand of those “so-called” open source solutions.
Companies, particularly smaller firms, will eventually need to jump on the cloud bandwagon, probably sooner rather than later. Bring your own device (BYOD), the need for more process automation, and the ability to meet ever more challenging IT needs in an unpredictable economy could all be addressed by a well-constructed private cloud.
For these firms the idea of using a public cloud, like Azure or AWS, is too costly and riddled with too many variables not directly under their control. However, many of these same entities are not financially able to, or do not have the expertise available to, build their own private cloud. For these companies, open-source platforms like OpenStack allow them to design a cloud, provision the network and data center elements, and deploy it all at a far lower cost than the proprietary systems offered by the bigger providers.
These smaller companies could benefit from seeking out a development firm well versed in both the traditional systems (Oracle, Microsoft, etc.) and open-source systems. A well-trained team of developers, familiar with both open-source cloud operating systems and the full set of applications and environments the cloud will be built from, can deliver a robust private cloud at a fraction of the cost of the mainstream systems.
Even if you are more inclined to follow the Oracle WebLogic route and build either clouds or individual applications that interact with a cloud, go mobile or simply serve as a marketing tool, many of these smaller outsourced application development teams are well suited to helping your business benefit from these advancing technologies.
One thing is for sure, the cloud is not going away and the various forms it will take in the future may well make the difference between a company winning in its marketplace…or losing.
How Microsoft has been quietly pulling ahead in the BI market.
Gartner’s Magic Quadrant report, issued in February of last year, was the early warning signal that Microsoft’s long-time effort to build the best of the best business intelligence (BI) platforms has given it a leadership position in the space.
Part of the focus has been making BI more directly accessible via Office and SharePoint 2013, with the latter picking up some very helpful features. For one thing, administrators won’t need to configure Kerberos, as the need for it is eliminated in SQL Server 2012 SP1. In addition, the PerformancePoint Services built into SharePoint 2013 support the Analysis Services “effective user” feature, again eliminating Kerberos delegation when you are using per-user authentication for Analysis Services data sources.
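The mechanism behind the “effective user” feature is simple: a single trusted service account opens the connection to Analysis Services and passes the end user’s identity as a connection string property, so no Kerberos delegation is needed. A rough sketch, with placeholder server, database and user names:

```
Provider=MSOLAP; Data Source=MySsasServer;
Initial Catalog=SalesAnalysis;
EffectiveUserName=CONTOSO\jsmith
```

Analysis Services then evaluates roles and security as if CONTOSO\jsmith had connected directly, provided the connecting service account has administrative rights on the instance.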
In April, Technology Spectator contributor Fredrik Tunvall reported that Microsoft was working even harder to “simplify its BI platform from a technology and go-to-market perspective.” In his opinion the major issue isn’t the tool set but rather Microsoft’s lack of “ability to clearly communicate its BI strategy.” He went on to point out that the big M’s latest offering is made up of essentially three core products: SQL Server, SharePoint and Excel.
He noted that SQL Server and SharePoint provide the data management, governance, administrative insight, security, and of course collaboration capabilities, but that it is Excel’s almost universal hold on end users that is most critical. Thanks to its long existence and familiarity in the enterprise, Excel is what really propels Microsoft’s BI solution above its competitors.
But the story gets even better if we fast forward only a handful of months.
Now comes Microsoft’s latest addition to its growing cloud, Office 365. And with it comes a set of BI tools and a few capabilities even its stand-alone products can’t deliver. SQL Server product marketing General Manager Eron Kelly was quoted in a recent piece from PCWorld saying, “Power BI for Office 365 brings together our entire BI stack and offers it as a service”.
For end users this creates a sort of self-service business intelligence system and, with an upgrade to the ProPlus version of the managed service offering, delivers both Power View and Power Pivot via an online edition of Excel. Now, even within a complete cloud environment, Microsoft is planting yet another flag in the BI field.
Still, most small and mid-size companies will find these intricate systems difficult to get into and may risk wasting a dollar or two trying to do it themselves. Even more important is the fact that nearly every business, regardless of size, can benefit from careful analysis of its ongoing data stream and from the well-developed BI tools available to do it.
Many would benefit greatly from engaging an outsourced business analysis team specializing in both existing and emerging Microsoft BI systems and tools. Often these teams are found in organizations that specialize in building database stacks and warehouses and have the comprehensive programming knowledge that helps smooth over system incompatibilities. “Big data”, and even the “small data” that has been sitting somewhere in your database for years, will yield a positive return when properly mined by a solid BI system, because in the long run it is that knowledge that leads you into the future.
For the better part of 15 years businesses have been using websites as part of their marketing strategy.
At first the new medium was widely accepted and virtually every business, with a few exceptions, felt compelled to have one. In the last ten years other mediums have sprung up tied to the Internet, creating more platforms businesses have to consider when reaching out to their audiences.
More mobile than ever, whole segments of the population are locked into receiving all or a portion of their information and communications via smart phones that access the Internet. Pads are seeing wider and wider use, and eventually it seems reasonable that they too will become a common device marketers will not be able to ignore. Add social/content marketing and the Internet search component, with search and email still holding a commanding lead over social network marketing in actual sales, and suddenly the website seems to be…displaced.
Popular “inbound marketing” companies simply take the same “closed loop automation” tool set and sell it to you as a managed service. The truth is that their methodology is sound, but they are not really giving you anything that an internal strategy, built on a website robust enough to provide the same functionality, can’t.
The cost of these services, usually billed monthly as a “managed service”, is prohibitively high for many small to mid-size businesses. And it still requires the client to engage internal resources for the actual content, or incur additional costs by engaging outside sources.
These managed services work, provided you come up with good content, but most small to mid-size businesses would do well to reduce their automation costs so that they can pour more into the content development. In a closed-loop situation, content of varying types is critical and the system is only as good as your content is compelling for your target audience.
But that is only the closed-loop automation part; there is more to the puzzle.
Closed-loop marketing is highly dependent on email. Lists of willing contacts who are already on board with receiving your messaging are the end game. Getting them to that point requires outbound messaging, strong search optimization methods (paid search in particular) and, once again, content that has legs beyond the email format. Video is one strong component. The use and deployment of things like white papers, infographics and blogs also contribute to getting folks to give you their email contact info and “opt in” to your email marketing.
Both organic and paid search will give you that “pump-priming” action, but again they need compelling content, regularly deployed to your website, or paid search used as a come-on to drive prospects to landing pages pitching the content. If the content is compelling enough they will give you their email to get it. If it isn’t…well, they might give it to you anyway, but the total number of folks doing so will be greatly diminished. Good content, clever content, compelling content abounds these days, and the bar is rising for the quality of content that will get your audience’s attention.
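As a toy illustration of the closed-loop idea, the sketch below gates each piece of content behind an email opt-in and records which content earned the address, so later campaigns can be segmented by demonstrated interest. All the names here are hypothetical; real marketing automation platforms do far more, but the core bookkeeping is this simple:

```python
# Toy sketch of closed-loop lead capture: content is gated behind an
# email opt-in, and the system records which content earned each address.
# Names are illustrative, not any vendor's API.

class LeadList:
    def __init__(self):
        # email -> list of content items that earned the opt-in
        self.contacts = {}

    def opt_in(self, email, content_item):
        """Record that `email` opted in to receive `content_item`."""
        self.contacts.setdefault(email, []).append(content_item)

    def segment(self, content_item):
        """Return contacts who opted in via a given piece of content."""
        return [e for e, items in self.contacts.items() if content_item in items]

leads = LeadList()
leads.opt_in("pat@example.com", "whitepaper-cloud-costs")
leads.opt_in("sam@example.com", "infographic-bi-trends")
leads.opt_in("pat@example.com", "infographic-bi-trends")

print(leads.segment("infographic-bi-trends"))  # both contacts
```

The segmenting step is what closes the loop: the content a prospect chose tells you what to send them next.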
Mobile presents yet another challenge. Do you do this now in a different format than your website and invest more in mechanisms than in the development of solid content? Or do you take advantage of new website frameworks and look at solutions that will give you all the tools and capability needed? So many questions.
BTU’s new “Fluid Web” service is our answer to helping businesses sort out this need. With Fluid Web we build all of these capabilities into one web platform. Not a service that you pay a huge monthly or yearly cost for. It’s your web. Your marketing tool. And at a cost that leaves you with budget left over to build the most important part…the content.
If you would like to see just how a “Fluid Web” will give even the smallest of businesses…businesses just like yours…Fortune 500 level marketing, sales and lead generation power, call or email me personally. You won’t be sorry.
The span of time usually associated with a full turnover in technology is about how long ago I started my five-chapter deep dive into business reinvention from a web- or Internet-centric starting point.
However, after lo these many months, I have to revisit the work to include a joint center better defined as fixed-and-mobile web-centric business modeling. It is absolutely clear that the combination of app and cloud services development is benefiting from the sudden influx and marriage of independent open-source folks from both the web and mobile industries. Perhaps it is the launch and growing penetration of iPhone alternatives, the most robust being the Android market. Add the support of Flash in the non-Apple products, and that whole Flash game, application and just-for-fun development community further extends the available content for this new venue.
The synergy that can be obtained among collaboration/project management, merchant services, logistics, customer relationship management, customer service, sales force management, provisioning, order processing, payment and performance tracking, by systems that treat both fixed and mobile devices as equals in need of continuous sync, is perhaps one of the most powerful uses of tech integration since the birth of ecommerce websites.
What I like most about this new rush to develop innovative services, tools and applications that integrate properly across these two platforms is that it gives many businesses that currently do not use new communications and data-driven technologies in marketing and sales access to tools engineered to meet their unique channel needs. These are often businesses that, for a wide variety of reasons, are not engaged in, or see no value in, the more mass-consumer methods like web, social, digital media and online B2B networking. They are the ones who resist involvement in business social networks, viewing them as “time wasters” (code for “I already wear too many hats; I have no time to sit and participate in the discussion”) and see no value whatsoever in the ability of either web or mobile to drive lead generation or sales.
Mobile devices, particularly smart phones and pads, are the biggest component in this particular integration scheme, as geolocation allows for both logistical capabilities and location tracking of sales, delivery and other deployed human assets. Add to that mobile CRM, smart phone payment systems (credit card readers, barcodes, mobile ecommerce sites), order entry, inventory search and countless yet-to-be-developed apps. Integrate equally robust, hardwired, broadband-connected fixed devices in continuous sync, and your road staff just got immensely powerful.
I am increasingly convinced that very nearly all new business models, regardless of size or scope, will need to begin their business modeling, strategy development and eventual marketing planning with both fixed and mobile web at the center of their overall communications and channel development. And don’t get me started on what happens when HTML5 takes a firmer hold and native video and audio support (among other features) reduces or eliminates the need for Flash or Silverlight to play video.
Now I have to go back and rewrite the book…stupid 18 month rule!
Another post driven by a LinkedIn discussion, this one in response to a general question about leadership and top-down vs. organic business organizational models.
You may lead, but will they follow?
One of the hardest parts of being a leader is getting your employees to follow. I lead by example. I became certified in everything my employees were certified in.
How do other leaders get their employees to follow?
Say what you mean and mean what you say. Don’t think about leading or connive to do it…do it. Be honest even if you don’t always have good news or are communicating your own error or misjudgment. Lead by example…sure…but allow others to also lead and occasionally you follow. Look outside your industry and don’t live with blinders on. Set your expectations and then communicate them clearly. Draw the leadership line when it comes down to a decision and take responsibility for that decision…this is when you lead and they must follow. Encourage challenging you and be flexible when your team alerts you to needed planning changes. Have a plan…don’t just wing it. If you “wander around” be sure you are managing…not micro-managing.
It should be noted that, as much as we would like to find a perfect formula or model, leadership is very much situational, driven as much by “talent” as by knowledge or system, and there are success stories for both the top-down and the organic model.
The truth is that in all too many organizations the model is actually top-down while pretending otherwise for the sake of [pick your favorite reason for attempting to mislead your own employees and write it here], resulting in an organization that has diluted responsibility sufficiently to hide ineffective leaders for years at a time. If you pick either model you must follow through and lead with consistency.
Leadership is an art. Good leadership is organic, evolving.
Well, as promised, I wanted to update this story, as it has been one of the most remarkable applications of LinkedIn I have personally experienced in the several years I have been a participant. As you may recall (if you are a regular reader), I mentioned that I had joined forces with a couple of folks who have a Fortune 500 type of background like I do, and that their focus has been on seeking out mid-western US small to mid-sized manufacturing companies (and, as a result, a handful of manufacturing-like businesses) to aid them in getting funding for growth and realizing opportunities in the new economy.
The response so far has been nothing short of fantastic and has led to our launch of a new business that incorporates this effort and a set of follow-on services provided by well-experienced, high-level veterans in manufacturing, business and marketing strategy, and international business development. We found multiple companies who not only were surviving but had multiple opportunities that we are leveraging to give them the kind of future success many had thought impossible due to the disconnect between traditional lenders and US manufacturing.
It also made it clear that reports of the death of US manufacturing, particularly among small and mid-sized companies, are very much premature! There is a growing need and rationale for seeking out US-based sources for a variety of manufacturing needs. Regionalizing logistics to reduce both cost and the environmental effects of long-range shipping, facilitating relationships between companies to source in ways that take advantage of economies of scale, and building stronger regional economies all make sense as the US climbs out of its self-imposed economic crisis.
We continue to seek primarily Michigan businesses, and those in adjacent states, as we ramp up to go much broader in January of 2011. Once we have launched this new effort we plan to extend our reach to include more US companies regardless of their location.
One thing I will note is that there are still far too many companies off the LinkedIn platform who could benefit from this. After meeting with a number of the folks who did get to us, and a handful we found outside of LinkedIn, it is clear that many manufacturing execs are not participating in LinkedIn simply because of the intense “social marketing” hype that clouds the relationship-building aspects of the site. That, plus the fact that many view LinkedIn as becoming more and more of an HR recruiting mechanism, causes many to take a pass, as they see no real benefit to them in their smaller-scale environments.
We are seeking to change that as well, and hope that those folks who are part of the LinkedIn experience and have access to these decision makers will eventually help convince them there is more to it than hiring and “social media gurus”.
Stay tuned…more about this later!
A funny thing happened to me the other day when I was out looking for folks in need of more sophisticated marketing services than just “I need social” or “I need a spiffy name for my new venture”. Mind you, I do enjoy those types of projects, but not as much as putting more difficult puzzles together. Part of that search involves reconnecting with my own “social network”, particularly those in my immediate area who have not been as well cultivated during my years of national and international business focus.
Another part of that has been really coming to grips with the difficult situation my home state has found itself in now that it has borne the weight of the economic downturn. As I have long-established roots here, not to mention six kids, ten grandchildren and neighbors who are all suffering the effects, I want to be able to contribute to efforts that will provide real help growing our local economy.
The funny thing was that for some individuals in a position to really help grow our manufacturing businesses, finding the companies to help was sometimes difficult. During a recent discussion with one such set of folks, I introduced the idea of utilizing the more focused discussion groups on LinkedIn as a way of “inbounding” inquiries from the companies who need help.
Now of course this is not at all unlike what Hubspot.com preaches in its Inbound Marketing University, although in this case we are offering money…not offering to take it! Our approach is simple and includes a carefully drafted discussion post outlining the type of companies we believe we can help, the range of investment we are willing to work with, and how to contact us in private. We also made sure to contact the group manager whenever we thought our post might be considered spam, and included a disclaimer requesting the group to contact us if the post was not appropriate.
I’ll keep you all posted on how well this worked, but it is an excellent example of how to creatively use social networking to benefit not only yourself but others as well.
The Internet works like a conveyor with multiple parts. Parts whose independent actions work together to move prospects along to become customers. It is foolish to say that only understanding and implementing a single part will bring success. The beauty of today’s marketing universe is the vast number of options for cost effectively integrating multiple strategies to build a perfect conveyor, one that moves your prospects along to become customers.
The Internet is not just the Internet anymore. It is everything that touches the web. It is everything that can access the web. It is every technology, mobile and fixed, that can drive your customers to a web site, deliver your brand, interact with your audience, alert them to a deal, or a location, or whatever turns your prospects into customers and your customers into cash flow.
The Internet, and most marketing efforts, continue to require of their most successful users a solid understanding and occasional use of traditional media. Knowing which mediums to use, how much to spend, how to buy, how to measure, and how to make those “parts” of the conveyor work with all the other “parts” is critical. These often include parts that exist on a new technology or involve driving your prospects to new forms of customer interaction.
The Internet, regardless of what the web “experts” say, is no longer just about inbound vs. outbound. It’s a mash-up of both, and sometimes it is about neither, but about the conversations you have, your prospects have, your customers have, and the conversations they listen to. To leverage those facts you need to integrate social and PR into your entire mix, remembering that they are ad platforms as well, and not lose sight of their importance even when their bottom-line impact seems “indirect”.
In the “old days” the big marketing departments and ad agencies were all about “integrated campaigning”. They were also all about “the brand”. They were quite right to be focused on these general strategies.
Today’s marketing requires you to continue to consider both of these “general” strategies critical components of your marketing and public relations. The only difference today is that you have a mind-boggling array of messaging and interactive channels available for integrated campaigning (whether inbound, outbound or a mash-up of both), and the definition of “brand” has taken on new complexities with the emergence of social image needs, personal brands and old brands mutating into surgically altered younger versions of their former selves.
I can see how this would confuse younger, less experienced marketers, and I am sure that many executive leadership teams, who have often been ill at ease with the more difficult-to-quantify mysteries of motivating human behavior, are beside themselves trying to figure out whether or not to support any of them with ever-tightening budget dollars.
If executive teams were confused before when they saw in your plans that you intended to integrate messaging via print, radio, TV, and billboard, then having everything from consumer social networks with ecommerce applications to mobile and proximity marketing has probably shut them down completely.
Their solution is likely to be to assume that only one or two of these are the “magic bullet” and to proceed to seek, hire and/or fund folks solely focused on those “magic bullet” strategies. It’s what happened with dot-coms, and before that telemarketing, and before that direct response…in fact with every “new” ad-scheme-to-end-all-ad-schemes since the dawn of marketing.
And…they will find many overzealous folks around today, particularly in the B2B social networks, who will absolutely proclaim that only one or two of these channels matter. They of course do this speaking generally, without one bit of analysis of your audience or their current behaviors, and without any evidence whatsoever.
Don’t fall for it.
Take the time and understand your prospects and customers thoroughly. Employ a firm or individuals internally that have both the width and the depth, including a strong understanding of more traditional channels, to help you define your strategies and select the proper messaging and interaction channels for implementing your marketing, PR and sales. Only after you are sure what components need to make up your “conveyor” can you build one that will bring you customers.
The one thing that’s sure is that your audience’s attention span is short and gets shorter every day. Multiple channels, designed to deliver your messaging, drive conversations with your audience and build your brand, ensure that your short messages eventually combine into compelling arguments for your prospects to buy your products or services.
So the next time you hear someone say “it’s all about social” or “it’s only the new media” or “XYZ is dead”, think twice and find yourself expertise that is savvy in identifying the best pathways to your prospects and customers…not someone who insists there is only “one way to skin a cat”…and let them help you fire up your conveyor.
Better yet…call me.
Again I am posting here some thoughts I shared in a LinkedIn discussion (kind of the reverse of the norm for LinkedIn, I think), so again they are brought into this environment where I hope to speak in ever-increasing detail about a variety of subjects I believe get more hype than substance on the more general social and B2B networking sites.
The question posed was, and I paraphrase here a bit, “Are social networks replacing advertising?” The multiple comments prior to my input had been simply wonderful, with some of the most balanced observations about social network marketing I have seen without a heavy interjection of the almost religious hype from real hard-core social networking devotees.
I of course went a slightly different way:
What I find most interesting about these discussions is how they seem to discount, even ignore, the fact that although we call these “social networks”, they have evolved, as far as user interfaces and site functionality are concerned, in many ways out of the affinity network sites that have been around for years. Those sites in turn grew out of the BBS and Undernet chat rooms of the Internet’s distant past (more BBS than chat).
Unlike the affinity sites (I am most familiar with those of a music industry nature, talkbass.com being the one I was most active in), the social network is based on relationships in the real world (and, I suppose, those you now establish virtually), both old and new, whereas the affinity sites are built around some hobby, professional niche or other common interest that is more specific than “I know you and you know me”.
These groups have always had to balance advertising and, more important to this thread, discussion-based interaction with members who also represent a service or manufacturer wishing to sell to the audience, without it becoming a flaming bloodbath of hostility from other participants who are passionate about their art but unwilling to be marketed to in a discussion forum unless that purpose is clear. On talkbass.com, for example, they handle this by identifying in the online “signature” an individual’s endorsement or employment status, so that other participants can weigh their comments or “buying suggestions” accordingly. Often the folks who are industry reps, and are there in part to bring back customers, are also the most helpful, with an obvious ability to be honest and even-handed in their recommendations. Many will suggest another manufacturer or service when it is clear they do not have the right solution for the participant.
I will also point out that it takes a considerable amount of effort and time to get to a place where you are accepted by the group. But after the hard work is done there is evidence that it has a favorable effect on the business: many have been involved in these types of sites for over ten years (yes, Virginia, there were social networks of a fashion prior to YouTube, MySpace and Facebook) and in their particular niches have become experts at integrating site advertising, discussion leadership and participation in day-to-day discussions in ways that benefit both the participants and the web community they are part of.
Much of the discussion I read here at LinkedIn concerning social network marketing seems not to recognize the evolutionary path that social and business networking sites sprang from. As such, it often doesn’t look to those past, and in many cases still active, niche affinity networks for clues on how to market effectively both on B2B sites (like this one), which are really the closest thing to an “affinity” network, and in a more “social” environment, since the more sophisticated affinity sites have well-integrated approaches to both site ads and “discussion” marketing.
As for the original question, I prefer “evolve” over a prediction of a specific end to an activity relating to the marketing and selling of products and/or services to humans. I wanted very much to predict the end of disco, but I hear it’s coming back…like bad ’50s rock. To state at this time that any of these networks will replace and eventually eliminate all other forms of human communication is far too “crystal ball” for my sensibilities.
This of course sets aside the eventual inclusion by Walmart of every conceivable product and service known to man, and their complete enslavement of the entire human race…at a low, low price and questionable quality…at which point they will outlaw advertising and this discussion will be rendered moot.
The original discussion can be found here: http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers&discussionID=15877628&gid=145854&commentID=13659722&trk=view_disc
Several of the comments thus far are very insightful and I encourage any readers of this blog to visit those as well.
I read a strategy professor’s use of these two musical genres today as a way of showing the difference between, and the downsides of, two different management styles. According to him, an organization that had grown, or was growing, beyond what the traditional entrepreneurial skill set could handle would no longer get successful outcomes from it…or so he was trying to demonstrate.
Although I agree with the use of the metaphor and think there is value in it, his knowledge of music was limited, so I replied with the comments that follow this brief introduction. I am also posting them here, as I believe it is an excellent comparison to make and wish to ensure that my thoughts on the subject are available to folks foolish enough to read my blogs!
Here are my comments:
As both a strategist and a Jazz musician of no small prowess, I must say that although I agree with the general metaphor, you unfortunately lose me in paragraph 3. The statements made there in an attempt to draw the parallels (jazz vs. symphony) show that your knowledge of Jazz is lacking, which hurts your use of the comparison.
It is overly simplistic to focus on improvisation and tie it too tightly to raw talent alone. That is hardly the truth: the accomplished improvisational player can only function well if they are first a consummate student of the rules (music theory and standard notation at the very least). This is precisely the same for those at a professional level in the symphonic arts, who must know the same rules within exactly the same basic structure. There are some differences in rules and structure between the two genres (or at least rules for when structure is absent, in the case of Jazz), but the impression that Jazz is just free-flowing self-expression without direction or plan, simply blowing where the wind does…is not correct.
I believe it fair to say that strategy under a symphonic metaphor would be heavily bureaucratic: orchestrated, with each member in perfect sync and following strictly, with little or no room for flexibility, other than perhaps an occasional first chair or soloist with enough room for dynamic “personal expression”, but not much more than touch, volume, and a tad bit of leeway in note length.
Jazz, as a strategy metaphor, also has structure (and a very strict one in the case of many modern jazz pieces) and is still ruled by the same system the classics are. Jazz allows some deviation from certain scale and mode rules, and starts out with an understanding between players of the end game, the way the piece will conclude. It may provide within the structure multiple paths for exploration and experimentation, but with firm cues or specific “anchor” parts or players, so that improvisation may move along different paths yet still have a way to return to the structured end game.
Each player is in tune with every other player in real time (the musical shared desktop), often communicating through a specific familiar musical phrase, a rhythm, eye contact, body language, or all in combination. It is a language that at times almost spontaneously creates itself. The players are also musically synced, with certain key rhythm section instruments even more acutely “in charge” of pulling things back into time or tonality, so that whatever happens improvisationally, the ending still sounds “tight”. Does any of this sound like real-time shared collaboration in a business context yet? Do you still think that jazz just plays it fast and loose?
I admit that these are very different strategic approaches, but I assure you that just having “raw talent” won’t get you far without a thorough understanding of, and appreciation for, all approaches. I will also tell you that Jazz isn’t all loose, with the exception of some forms that frankly are about as far out there as many a business model I have reviewed…and just as commercially and critically successful. It is in fact a carefully orchestrated form that is more about creating and guiding chaos into pleasing musical shapes.
I prefer Jazz, musically and from a business strategy/management standpoint, but will tell you that my preparation for Jazz is far more intense and demanding than that for the symphony. In a symphony you just have to be able to play in time and read the music. Skill is still a factor, and I have much appreciation for the skill of a consummate member of a symphonic organization. In Jazz you have to understand every theory, listen to every part simultaneously, and play as if it were flowing naturally from you with every breath. You can’t just read the music; you must become the music.
Your suggestion that somehow only an entrepreneur, and then only at a start-up stage, should be the Jazz strategist, and that once scaled to a larger size a strategy like Jazz cannot be the backbone of a larger company…is absurd. However, the community of folks who can in fact “play Jazz” into the billion-dollar range is small, and assembling a team of them correctly suited to each other and to the task is once again up to the “general’s” ability to identify and recruit the right ensemble.
Symphonies, on the other hand, will in the near future fail over and over, as all are led by folks unable to play Jazz. In fact the best world would be an ensemble led by a consummate player who has experience and talent in both classical and jazz settings.
You also indicate that you feel the leader’s relevance recedes as the jazz plays on into the more mysterious realms, and I would argue exactly the opposite: it is the leader (particularly a strong one like a Miles Davis or a Chick Corea) who takes the whole thing out on a limb and then deftly brings it back to a musical conclusion…an attainment of their objective. Had you ever tried to play “Spain” with a live arrangement that leaves it entirely up to the soloist to determine the direction and length of their solo section, understanding the whole time that once chaos has ensued you, and particularly the supporting team, must return to a particularly strict and difficult-to-play piece of music…you would likely understand what I mean.
Speaking of a little Jazz:
If you would like to read the original article in question it can be found at http://blogs.hbr.org/tjan/2010/02/strategy-as-jazz-vs-symphony.html .
Besides if you think a jazz style management scheme is chaotic…wait until we hit the rock and roll stage of business management!
Whether you believe it or not, there have been recessions before. There have been thousands of businesses that have seen tough times in the past. They managed to survive and, best of all, grow. For many their growth was greater than what would have been possible in a non-recession economy. These were businesses that saw opportunity in the down economy, made adjustments to their businesses and seized the day.
The good news: if you are willing to adapt, there are more tools than ever available to help you “seize the day”, tools that will help you look for and take advantage of new opportunities, grow your revenue, and even expand your market footprint. As with any set of tools, there is an equal set of rules for getting the best out of them.
First the rules:
Stop and be honest…
…take a step back and really assess the state of your business AND marketing. Not every tool will fit your business. In order to succeed in a tough economy you need to be honest about your strengths and weaknesses. It is equally critical to honestly gauge the effectiveness of your marketing. Understand these and you are almost ready for opportunity to knock.
Don’t operate in a vacuum…
…figure out what’s working and what’s not. Set up Google Analytics on your website, code your mailings, and ask people where they heard about you. Also ask your front-facing employees to ask, and keep asking. You need to do whatever it takes to really understand who your audience is and what they respond to.
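Coding your mailings and links can be as simple as appending the standard campaign parameters that Google Analytics reads from incoming URLs. A minimal sketch in Python; the URL, source, medium and campaign names here are made up for illustration:

```python
from urllib.parse import urlencode, urlparse, urlunparse

def tag_url(base_url, source, medium, campaign):
    """Append Google Analytics UTM parameters so each mailing,
    ad, or link can be told apart in your traffic reports."""
    parts = urlparse(base_url)
    params = urlencode({
        "utm_source": source,      # where the link appears (e.g. a newsletter)
        "utm_medium": medium,      # the channel (e.g. email, banner)
        "utm_campaign": campaign,  # the specific effort (e.g. a spring sale)
    })
    query = parts.query + ("&" if parts.query else "") + params
    return urlunparse(parts._replace(query=query))

# Hypothetical example: one tagged link per channel, so Analytics can
# tell you which mailing actually brought the visitor in.
print(tag_url("http://example.com/specials", "newsletter", "email", "spring-sale"))
```

Print the tagged link into each mailing or ad, and the “where did they hear about you” question starts answering itself in your reports.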
Mine your existing customers…
…and understand their needs. Understand why they continue to do business with you, and more importantly what they like, what they need, what their customers need. Email a questionnaire to your top customers, hold a focus group at your place of business, take some of them out to lunch AND really listen to them. Keeping these folks as customers means being in closer contact, accessible all the time and serving them better than your competition.
Use this data to build a new business plan…
…and then write a new marketing plan. This is your opportunity to challenge your business to explore new ideas, new markets, and reconnect with the customers who got you this far. To survive today you need a good plan built on solid data. Armed with those you will be ready to seize your opportunity to grow your business.
Accept that changes will be hard…
…and then make them anyway. Many of the options you have will be unfamiliar and you may want to run to the safety of what you know. Trust your data, don’t make excuses for bad habits and write a plan that is built on reality. If your plan says “embed your phone number in a web ad and never again use yellow pages”
…take a deep breath and do it.
Now the tools:
Business Social Networking (LinkedIn, Plaxo, Twitter)
Business networking online is just like networking in the real world, where we do it as individuals who are part of an organization. Demonstrate knowledge, skill, and expertise by being a resource to folks in the network. Be the person who knows the person, and even give information away if it builds your network. Then get your employees to do the same. This is the best way to utilize these types of “business social networks” to build credibility, find new partners, connect with your customers’ networks, and establish new prospect relationships.
Digital Gear & Gadgets
Voice over Internet Protocol (VoIP) communications tools are giving an increasing number of businesses greater mobility and flexibility in the way they communicate. “Gadgets” have made it possible to be more connected to customers, accessible to prospects, and some allow you to seamlessly integrate your personal and professional communications. New mobile data access allows you even more mobility and can make a business sharper and faster with information that is always available in real-time. If your plan calls for closer contact, better customer service or growing a bigger footprint, technologies like VoIP may be the answer.
Blogging
It used to be hard to get published, and big companies spent thousands of dollars trying to get paper publications to talk about the advantages of their products or services. These days, though, the world seems to belong to the “bloggers”. Blogging requires frequent updates and postings if it is to be lively and attractive to regular readers. However, it takes time to build an audience, so be sure you are willing, or have authors onboard who are willing, to commit daily time over many months for the best results.
Social Network Marketing (Facebook, MySpace, Twitter)
Social networks are large parties on the Internet, built primarily to have fun and to communicate with friends, family and new folks met online. If you are “selling” at that party you are largely going to be ignored. If you are there for brand building, be sure you don’t “hard sell”. What you need to work towards is being the life of the party; that’s what will get you attention and help build your brand. If your business is B2B, evaluate the audience in these types of “social networks” carefully. Many B2Bs have found marketing to businesses less successful in these environments, with more impact coming from “business social networks”.
YouTube (Viral Video)
YouTube can be used for viral marketing and in some other useful ways. If you use it for viral marketing, remember it is what it says it is: “viral”. That means either it will spread or it won’t. What successfully spreads on YouTube is all about what gets attention, what is entertaining, and even what is shocking. For B2B, YouTube can be a good place to store video for demonstrations or for providing technical help to customers or potential customers. For viral efforts, short presentations, low-budget video, and a focus on entertaining content (dull is to be avoided here) are what work. With YouTube, and many other viral marketing efforts, frequency matters…the more you post the more attention you get.
A Look at Internet Centric Business Modeling…
…or Some Really New Ideas and Some Old Ideas Dressed Up as Really New Ideas…
…or How I Spent My 2009 Sabbatical
What else…a disclaimer.
Summer 2009 was kind of slow and cold, which made for more of a reflective time than one of hectic productivity. My attention centered on the changing landscape of a nation as reflected through the fall of one small grocery store in a bad mix of timing, technology change and horrible economics. Of course, as things do with me, I found that this one small event turned into a much larger and further reaching range of thoughts about what opportunities might emerge from this difficult time.
I admit here, where you can decide right off the bat how much further you will indulge me with your reading, that I am neither an expert in the grocery retailing industry nor do I claim any particular expertise beyond years of leading successful marketing communications teams for sizable corporations. If I have any expertise in the areas I describe throughout the words that evolve during this blog posting, it has been acquired through forty-some-odd years of observation, self-education and the good fortune of having a large network of very intelligent folks who have, mostly through osmosis, managed to instill some small understanding of the business dynamic in me.
What I am about to recount to you are observations and ideas for not just including the Internet in a business model, but rather re-thinking a business model with the Internet at its center, used to resolve specific problems, not just for “whiz bang” purposes. This is combined with a belief that a “re-connection” is long overdue between Internet commerce and the brick and mortar “real world” that we all live in.
There are models that I have “stolen” ideas from, models that signal, to me at least, that others perhaps smarter than I are already making this reconnection happen with their businesses. They also happen to be models that reinforce some of the other observations I will make as we proceed through this “business model makeover”. I am also taking lessons from “dot com” history and combining them with current trends (and woes) to shape what I think is a reasonable look at a successful future business model.
Why the multiple titles?
I, like many folks who have focused on copywriting for others but have seldom written anything in their own voice, wanted to do something in detail and use whatever language, style and silly comment I wanted to. The “academic” title slash “silly” title approach just started out as a reminder that this was my deal and I should write it my way. Pay no attention to these titles other than to have a laugh if it suits you and to help you navigate through the five chapters, as that is my target length for this little ongoing rant.
I am counting this chapter as one of the five, so that means you’re only four chapters away from ending this silly thing.
On to the soapbox part; Chapter 2.
II. The Internet Centric Business Model/Why the World Must Change
Larger cities must shrink. Larger companies must shrink. Companies in general need to become more interdependent and regionalize logistics. A more distributed population, workforce and employer system needs to be created, one that provides for less of the population concentration that tends to result in large, difficult-to-maintain and impossible-to-manage infrastructures. Industry and government need to work together to use all the available resources and provide for greater focus on effective management of the overall infrastructure, particularly the reach and availability of broadband data access.
The above is true regardless of country, nation, religion or other predefining description of the economy or culture. The “animal brain” that was the old Internet must give way to a more organized operating system so that this new matrix can be reconnected and directly interact with the real world.
For me this was what I saw as the overarching problem, both economic and societal. We have made the world smaller via the Internet, but we have been concentrating too much into too little space for much longer than the web has been around.
Now by space I am referring to just about every way you can over-concentrate our human society. We cram too many people into too small a geography in every major metropolitan city in the world. We have too many industries with too few competitors in them, and too many trying to serve national and international customers from too centralized and concentrated power centers. Workforces are either feast or famine in whole regions, and when they don’t do well they tend to drag down the rest of the economy. Like dominoes they fall with rippling effect and take market after market, country after country down in their wake.
I say this with reluctance because I believe in free economies and capitalism in general. But I believe that in order for the world economy to stabilize permanently, and for there to be any chance of a calmer and more productive world, we need to change just about everything, including introducing ideas into the capitalist system that could clearly be viewed as “stolen” from more socialist or even communist models.
Communities need to evolve around the growth of local industry regardless of whether that is expressed as manufactured hard goods, the cultivation and marketing of farm goods, the creation of electronic content, the development of software, or the overall acquisition, storage and use of data. In other words cities must strive to begin anew with hand in hand efforts to cultivate more regionalized economies that are tied directly by the Internet to the world economy.
Assets like unused or underutilized warehouse space, abandoned manufacturing facilities or those with an option to repurpose, office and industrial parks that are not well populated, and any other facilities or silenced equipment could be cataloged, turned into searchable data and delivered as part of the resources for creating new businesses.
In turn these assets themselves can, and should be, evaluated as to what they need to further drive business occupation and use. In general this will be about broadband access, as in many cases these facilities, when located in smaller and/or rural locations, lack affordable access to high bandwidth services. By offering the analysis as motivation to deploy more affordable broadband to specific facilities, these locations improve their chance of being matched with a business venture that will improve the overall economic and employment conditions in smaller towns and communities.
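To make the cataloging idea concrete, here is a rough sketch of what “searchable data” for idle facilities might look like. Every field, facility name and town here is hypothetical, invented for illustration; a real catalog would carry far more detail (zoning, utilities, asking price, and so on):

```python
# Hypothetical catalog of idle or underused facilities.
assets = [
    {"name": "Riverside warehouse", "town": "Ithaca", "sq_ft": 40000,
     "type": "warehouse", "broadband": False},
    {"name": "Old tool-and-die plant", "town": "Owosso", "sq_ft": 65000,
     "type": "manufacturing", "broadband": True},
    {"name": "Elm St. office park", "town": "Alma", "sq_ft": 22000,
     "type": "office", "broadband": True},
]

def find_assets(catalog, min_sq_ft=0, needs_broadband=False, facility_type=None):
    """Return facilities matching a venture's basic requirements."""
    return [a for a in catalog
            if a["sq_ft"] >= min_sq_ft
            and (not needs_broadband or a["broadband"])
            and (facility_type is None or a["type"] == facility_type)]

# A venture needing 30,000+ sq ft and broadband today would match one site;
# the unmatched warehouse is exactly the kind of location the broadband
# analysis argues for upgrading.
for a in find_assets(assets, min_sq_ft=30000, needs_broadband=True):
    print(a["name"], "-", a["town"])
```

The interesting output is as much the misses as the hits: every facility that fails only on the broadband field is a ready-made argument for deploying service there.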
Increasingly, fair cost (notice I am not saying “low” or “cheap”) and high value will be what business and consumers alike will be seeking. Helping existing businesses to adapt and find ways to provide fairly priced, higher value products and services is absolutely critical to moving Michigan (where this case study is derived from) and the entire US economy back into the “thriving” category.
Most of what needs to happen concerns the inefficient and wasteful habits companies have developed over time, chief among them how willing we are to ship steps in a process all over the country and world without any regard for sound, efficient logistics. Now that we also need to be concerned about the energy those wasteful shipping steps consume (whether out of concern for cost or for the health of the planet makes no difference anymore), shipping here and there for no good reason is just stupid.
Now that it is totally clear that the “reset button” has been pushed there is no time like now to begin a meaningful metamorphosis of our business models to ones that adapt to the new market economics and are built to ride its ever rising tide. The ugly side of this reset for some is that the return to anything like the inflated booms of the last two plus decades is both unlikely and should not be welcome.
We just plain overdid it. And yes, I am stating the obvious. In fact for this whole tome so far I really feel I may be stating what is obvious to anyone paying attention.
III. The Grocery Web Business Model/What You Do While You Watch a Business Die
The Small Town Grocery
I have recently taken a look at a large grocery wholesaler and in particular the brand of grocery stores that once held a monopoly on my small town. As one of these stores is in my little town it is a model I can see in action and have for many, many years.
A few years ago Meijer’s™ (a larger chain of stores similar to Walmart™) moved in across the street, and the store has struggled ever since to keep its doors open. In fact all of the stores owned by this brand struggled, so much so that the chain itself, a privately held business, was sold to their grocery supplier of many years, a generic product labeler and wholesaler.
What is interesting to me about this particular store is that it has a liquor license and the larger “super store” does not. As such, some local business continues to be driven there by the sale of spirits, but not all of the liquor buyers are converted into grocery shoppers; in fact only a small portion shop the rest of the store.
As a grocery competitor they cannot compete without a significant differentiator that would cause buyers to pass on grocery shopping at the super store. Unable to maintain a diverse inventory and adequate service personnel, and without sufficient cash flow from the overall grocery business to support any effort to compete “head to head”, they are left with a shrinking customer base, shrinking staff, and a diminished ability to maintain services. This of course makes them less competitive, and continued shrinking of services will eventually result in the store closing as its customer base shrinks away.
Old Idea, New Twists
Now let’s introduce an old idea with some new twists.
There are still a number of “online groceries” but most are mail order style home delivery types of businesses and have no real physical presence at all in the “real world”.
I have always been intrigued by the idea of the online grocery shopping experience, but I can never get past a few aspects that keep me from using one regularly. My primary one is the inability to pick my own fresh produce, meat, and other grocery items I like to select myself, rather than receiving seemingly “source unknown” packaged meats or other perishables. Produce also does not always ship well to the home, and waiting for your groceries to arrive via shipment seems less gratifying than just shooting over to the store and shopping.
Convenience alone won’t drive business to an online grocery; even savings alone can’t do it. But combine the online service with a physical location designed to accommodate the purchase of wine, beer and spirits, fresh produce, meat and other perishable items, as well as pickup and payment for the rest of a shopping order (if it wasn’t paid online already), and you have something different. This would mean a very different sort of store layout, but one requiring a significantly smaller front-end footprint and fewer overall staff.
So in the case of my little local store, with its valuable liquor license, removing the need for dry goods and other shelf items to be in the physical front end of the store means the space used for storing those items can be utilized more efficiently.
No need for special service counters for over-the-counter medicines or tobacco products that can’t be on open shelves, no need for a frozen section; for the most part, even packaged dairy products and other cold-case items would not need to be in the physical front end of the store.
A deli/meat and fish counter, room for fresh produce, some wines and spirits (most liquor and beer would be among the items generally purchased online and picked up, immediately reducing liquor theft, which I understand is a big problem in many of these stores) and perhaps some impulse items scattered around the now smaller store footprint.
Checkout would be a matter of adding anything you selected fresh to your already-packed order and paying only for items that were not paid for online. Packing orders could start immediately after receiving the online order, and pickup could be designated as 30 minutes to one hour out in order to help control customer flow. Kiosks that allow for additional ordering of items you may have forgotten would also need to be available.
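The pickup-window idea above can be sketched as a tiny scheduler. The slot length, lead time and per-slot capacity below are assumptions to be tuned to actual packing staff, not numbers from any real store:

```python
from datetime import datetime, timedelta

SLOT_MINUTES = 30   # pickup windows half an hour wide (assumed)
MAX_PER_SLOT = 8    # orders the packing crew can promise per window (assumed)
LEAD_MINUTES = 30   # the "30 minutes to one hour out" lead time starts here

def ceil_to_slot(t):
    """Round a time up to the next slot boundary."""
    t = t.replace(second=0, microsecond=0)
    return t + timedelta(minutes=(-t.minute) % SLOT_MINUTES)

def next_pickup_slot(order_time, bookings):
    """Promise the first window at least LEAD_MINUTES away that still
    has packing capacity; `bookings` maps slot start -> orders taken."""
    slot = ceil_to_slot(order_time + timedelta(minutes=LEAD_MINUTES))
    while bookings.get(slot, 0) >= MAX_PER_SLOT:
        slot += timedelta(minutes=SLOT_MINUTES)  # window full; push out
    bookings[slot] = bookings.get(slot, 0) + 1
    return slot

# An order placed at 10:12 lands in the 11:00 window, keeping the
# customer flow spread out the way the text describes.
bookings = {}
print(next_pickup_slot(datetime(2010, 3, 1, 10, 12), bookings))
```

The same capacity map could feed the in-store kiosks, so a forgotten item added at the last minute stays inside the customer’s already-promised window when capacity allows.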
Reducing many of these overhead issues and bringing as much of the customer experience as possible online should allow for greater flexibility in product pricing. Incentives for buying generic (especially relevant to this organization, as they are primarily a generic provider and wholesale operation that took on stores in order to keep their distribution channels open) become possible with the ability to provide side-by-side comparisons of contents and customer reviews of product quality, brand vs. generic.
Even brand names could be sold at more aggressive pricing if all other cost containment efforts made possible by this hybrid online/brick-and-mortar approach are fully realized. It all could add up to a very nice value proposition for the store, and one that their competition would be slow to adopt, if they adopted it at all. This is the primary marketing differentiator; aimed at the right target audiences, it could be marketed to appeal to that audience’s need for more time, fewer dollars spent on groceries, and desire for a less stressful shopping experience.
It would not displace the bigger stores, as they will still carry items beyond grocery, and not all folks are looking to avoid the “traditional” shopping experience. It would, however, create a new customer base, one which arguably will grow over time rather than shrink, in an environment that is highly flexible and can easily adapt to changing customer needs.
This particular chain continues to have a few “video rental” stores open, but they have suffered from bigger chains with larger selections and more aggressive pricing. In this “online” scheme they could reinvigorate that portion of the business by adding the video rental inventory online as well, providing for pickup/drop-off at the same brick-and-mortar location (I might suggest a return-by-mail option as with Netflix, or even a partnership with an entity like Netflix, with private branding for the store and the Netflix distribution system as the engine) and/or some additional home delivery options that could be developed if a need arises.
As the example above reflects, these are dramatic changes to a business model. They do not accept that the current model works; they insist that the old model has to be scrapped to make way for an engine that drives the business forward and allows for customer and revenue growth.
I expect that if I were to walk in and propose this to the organization that I have used in this example they would run me out on a rail, more than likely tarred and feathered!
They would be wrong to do so, of course, and without looking at their books I can certainly understand how this would be a daunting shot at something that cannot be guaranteed 100% to work. However, I believe that in the case of this particular business it is the way not only to start growing their business again; it could easily be a model that they spread beyond their current footprint, or offer to companies in a similar pickle as a turn-key model, providing a new revenue source to add to their cash flow.
As much as this speaks to the integration of the web into the model, it also serves (and this was another aspect of why I felt it was really worth expanding my thinking on) to attack head-on one of any retail operation’s greatest problems: shrinkage and loss through theft.
This is the simplest aspect to communicate, and why it is last in this section. In the store layout model described above, theft could be reduced almost entirely, and at the very least limited to only those items available in the now-reduced store front end. The rest becomes a cakewalk thanks to its becoming a more data-driven, warehouse-type fulfillment operation. This obviously provides an environment where even employee theft is far more controllable.
Here is the Web 3.0 Part
Now that we have solved some major problems and probably saved the business dollars already let’s start the sticky stuff.
Now that you are driving folks to an online store, you begin to take advantage of social networking, product comparison, ad and special-offer generation (based on end-user shopping data), perhaps games, perhaps a “shop smart” club, etc.
At the heart of this is the collection, storage and application of shopping data gathered at the customer level. Implementing as many “one to one” online ad, game, and special-offer user experiences as possible gives further opportunity to fine-tune that data.
Understanding what they currently buy, particularly if you wish to drive more generic product sales or more of a particular brand name than another, allows you to:
- Do side by side comparisons of products a customer regularly buys.
- Show a testimonial from a neighbor or someone in the same town professing the strengths of the product being pushed.
- Make a special offer to first time buyers (why should such a fine marketing approach be wasted on a PC or a new flat screen TV…why not on a box of generic elbow macaroni or a particular brand of sliced cheese!!!) so that they are motivated to potentially switch their loyalty to either your generic offering or the brand you are pushing.
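That first-time-buyer offer is really just a query against the shopping data described above. A minimal sketch; the customer IDs and product names (yes, including the elbow macaroni) are made up for illustration:

```python
# Illustrative purchase history: customer ID -> items bought over time.
history = {
    "cust-001": ["BrandX elbow macaroni", "BrandX elbow macaroni", "milk"],
    "cust-002": ["StoreBrand elbow macaroni", "milk"],
    "cust-003": ["BrandX elbow macaroni", "bread"],
}

def switch_offer_candidates(history, from_item, to_item):
    """Customers who buy `from_item` but have never tried `to_item`:
    exactly the folks the first-time-buyer coupon should reach."""
    return [cust for cust, items in history.items()
            if from_item in items and to_item not in items]

# Push the generic macaroni to loyal brand-name buyers only; the
# customer already buying the generic never sees a wasted coupon.
print(switch_offer_candidates(history,
                              "BrandX elbow macaroni",
                              "StoreBrand elbow macaroni"))
# ['cust-001', 'cust-003']
```

The side-by-side comparisons and neighbor testimonials in the other bullets would be driven off the same per-customer history, just rendered differently.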
Facebook’s compelling social game “Farm Town” could give way to the online store’s version, “Shop Till You Drop”, with customers getting special coupons for certain events in the game (again, of course, tailored to reinforce behavior that is of value to revenue generation, customer retention, growth, etc.) that will further drive or maintain involvement…maybe even commitment…to the online store brand.
Needs! [From here on, folks, we are into more uncharted ground, and the content below will be edited further and added to as the list below indicates.]
To properly present this idea, and to aid in understanding the broad impact of this major-makeover (or start-up) business model, several things need to be developed.
- A rough layout or architectural drawing of a re-conceptualized physical store front-end
- A rough layout or architectural drawing of the warehouse, staging and packing area
- A rough layout or architectural drawing of the outside and parking of a hypothetical location
- An electronic mockup of a potential store interface and various depictions of key data driven activities as described above.
- Potentially we could get investors to fund the development of the whole code, building specs, possible liquor licenses for sale and locations etc. and go out and sell it nationally or internationally as a turn-key business solution
IV. The Regional Distribution Model/Why Big is No Longer Good and Big Communities Exist Just to Prove Big Won’t Last
Part of what drives the “IShopLocal.com” idea is the same thing that makes Netflix.com so darn convenient…the regional distribution system: warehousing inventory close to customers to minimize shipping time and guarantee on-time delivery for orders placed on the Internet. I am sure that is not their mission statement, but it is as close as I can come to an academic description of the real value of where they store the videos you rent and how they get them to you.
It’s too bad that, long term, and after they finally put out a truly theft-proof algorithm for digital media or give up on the idea of controlling its distribution completely (I could go either way really), even Netflix hasn’t gotten it completely right and will need to alter their model eventually. The fact is, I see an “evolve or die” future for most of the entertainment industry (or antiquated content development communities, as I like to call them). That future is likely to also mean enclaves of creatives developing content for consumption, but perhaps no real “star system” anymore.
Instead I think that Godin is right that a more tribal way of monetizing creative works will evolve, with creative folks, artists if you will, “stars” if you must, having more reasonable incomes; and although I think there will and always must be some form of class system, the stratum they occupy will not be as far away from the rest of us as it is now.
I guess since I have digressed this far I may as well continue just a little further and state that, in my opinion, the death of the entire entertainment industry, music, video, TV, book, game development, etcetera, etcetera, is imminent, and it is about damn time. Once again, too much money, power and, well, power have been concentrated into too narrow a corridor. But that is not the real problem. For the record, movie and television industries in particular, it’s their long history of entangled relationships and just plain too many people who have a piece of every single pie. These industries are driving others, like the telecom, cable and satellite companies (who must buy packages of channels that no one really wants, with tons of programs that no one ever watches), to not keep up with a public that knows full well it should be able to just pick and pay for what it wants to watch, when it wants to watch it.
My point is that none of these models can work any longer and all must be scrapped. They will need to die as new “content” distribution models are developed and the public becomes used to its new freedom.
[I will be going into the whole vision for the “new disposable entertainment” industry in future posts.]
V. Summary/Why Should You Give a Crap
This is where I will stop for this portion. More as I continue my research and evaluation.