Development: GNOME 3.10.1 Release

Hello all,

Here comes GNOME 3.10.1, the first update to GNOME 3.10. It includes
many fixes, various improvements, and translation updates over 3.10.0.
We hope you’ll enjoy it.

We won’t end it here; we will soon publish the schedule for our next
relea…

See What’s New In The 13.10 Release Of Kubuntu, Xubuntu, Lubuntu And Ubuntu GNOME

Kubuntu, Xubuntu, Lubuntu and Ubuntu GNOME reached version 13.10. Let’s take a quick look at what’s new! For 13.04, I’ve made separate posts for some flavours with videos, etc., but there aren’t so many changes in the latest 13.10 (Saucy Salamander) release, so I’ve made a quick summary instead. Kubuntu 13.10 highlights: Muon Discover, a new user manager, and KDE Connect (available […]

Development: 3.10.1 coming up!

Hey all,

It was not announced with the usual template, but we need your tarballs
for 3.10.1. As Matthias Clasen wrote:

So go for it, and let’s make it shine.

Fred

GNOME Foundation: Minutes for the Board meeting of October 1st, 2013

wiki: https://wiki.gnome.org/FoundationBoard/Minutes/20131001

= Minutes for Tuesday, October 1st, 2013, 16:00 UTC =

== Next Meeting ==
* Tuesday, October 15th, 2013, 16:00 UTC

== Attending ==
* Rosanna Yuen
* Andreas Nilsson
* Joanmarie Diggs
*…

BillReminder: Unix, Despite Its Age, Still Has Followers

Operating systems, graphics cards and processors all have combined to improve the workstation and give shape to the market. Although these factors are conspiring to generate a considerable shift towards the Windows NT or personal workstation, the traditional Unix workstation still has its loyal followers.

According to figures released last year by research firm IDC Canada Ltd., NT workstation shipments, between 1998 and 2003, are projected to have a compound annual growth rate of 15 per cent. For the same period, traditional workstations are expected to decline by three per cent each year.

Despite the Unix workstation’s decline, it will still find a home in many niche areas and among companies that have made large purchases in the past and are reluctant to switch, said IDC.

“It’s still doing very well in the scientific area, as well as in large-scale manufacturing,” says Alan Freedman, research manager for servers and workstations at IDC. “It’s the organizations with the huge installed base that haven’t made the transition, while the organizations that are small or more agile are moving towards NT.”

Traditional Unix workstations are still found in departments devoted to engineering, mapping, geology and other technical applications. “The Unix market is not shrivelling up or fading away,” Freedman says. “But what we’re seeing now is that some of the mechanical and electrical design areas that were wholeheartedly Unix are now at least taking a look at the NT workstations.”

The reason they are looking at personal workstations has a lot to do with lower prices, increasing operating system reliability and the advances in processor and graphics technology. Independent software vendors have responded by porting many of their applications to Windows NT.

Backing up the capabilities of the personal workstation are improvements on the processor front. Of particular importance are Streaming SIMD Extensions (SSE), an innovation from Intel Corp. of Santa Clara, Calif. Similar to the MMX innovation, SSE gives the Pentium III the ability to better perform the floating point calculations needed for high-end graphics work.
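
As a rough, hypothetical illustration of what SIMD means in practice (not taken from the article), the SSE intrinsics that shipped with Pentium III-era compilers let a single instruction operate on four packed single-precision floats at once:

    /* Hypothetical sketch: one SSE instruction adds four floats at once,
     * the kind of operation 3D graphics pipelines perform constantly. */
    #include <xmmintrin.h>
    #include <stdio.h>

    int main(void)
    {
        float a[4]   = { 1.0f, 2.0f, 3.0f, 4.0f };
        float b[4]   = { 0.5f, 0.5f, 0.5f, 0.5f };
        float out[4];

        __m128 va = _mm_loadu_ps(a);      /* load four floats */
        __m128 vb = _mm_loadu_ps(b);
        __m128 vc = _mm_add_ps(va, vb);   /* four additions, one instruction */
        _mm_storeu_ps(out, vc);

        printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }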

Coinciding with the advances in processing are low-cost graphics cards which ease entry into the world of high-end graphics work. In the past, the customer had to spend $3,000 or more to get a graphics card with an on-board geometry accelerator but now there are cards that can do this for less than half that amount.

Able to leverage the power of on-board processors, users get graphics performance that scales with their processing performance. One prominent use of workstations’ high-end graphics capabilities is in Geographical Information System (GIS) applications. Some of the major GIS vendors, such as ESRI Inc. of Redlands, Calif., are porting many of their products to the Windows NT operating system. Higher processing power, in conjunction with the latest graphics cards, allows for a more dynamic presentation of geographic information.

The newer graphics cards also allow workstation users to use two monitors on a single workstation. Graphics cards that make this possible include the Millennium G400 Series from Matrox Graphic Inc. of Montreal. Based on what Matrox calls the “DualHead Display,” the feature allows the user to extend one application across two monitors, or open multiple applications at once. Some users, the company says, display applications on one monitor while showing tool bars on the other.

Six months ago desktop PCs were equipped with 4MB video cards. Now users are getting 8MB or 16MB.

But while this is pretty powerful for the desktop level, hardware isn’t necessarily the only means of defining a workstation, argues Kevin Knox, a senior research analyst with the Gartner Group in Stamford, Conn. “I think workstations are defined more by the application than they are by the hardware,” he says. “Generally, workstations are systems optimized for a specific vertical application. It’s not just high-end, mid-range and low-end.

“I agree that the lines are blurred and there are vendors out there that play to that. I think their workstation numbers are inflated significantly because they are showing workstations in the desktop market.”

“The high-end market is flat to a small decline in terms of revenues, and a larger decline in terms of units because of NT,” says IDC’s Freedman. “However, some companies are coming down with lower-priced Unix workstations to combat that — most notably Sun Microsystems with workstations that are lower in price and target the same markets as the NT workstations.”

“So while Unix does not have the majority of the units, it does have the lion’s share of the revenue,” says Freedman. “We are predicting over the next four or five years, slight negative growth in units and a bit higher negative growth in revenue — about two or three per cent.”

Gartner Group reports that NT will eventually supersede Unix in the high-end market. The Unix versus NT operating system game has been playing for some time now, and vendors, which at one time clearly chose sides, no longer seem as sure of the winning team.

Not too long ago, the workstation market consisted of Sun, HP, IBM and SGI, but there has been a rapid penetration of Wintel systems, says Knox. “Sun is trying to protect its installed base, and frankly not doing very well on the low end,” he says. “They introduced the Darwin product and that really hasn’t taken off as I know they wish it had.”

What users are saying, he continues, is they have an office productivity machine for the everyday applications, and a Unix box, and they want to consolidate them into a single system. “Right now that’s the NT system,” adds Knox. He expects traditional PC vendors such as Compaq and Dell to take the lead in market share because of the improved performance of NT, Xeon processors and other technologies.

There are, however, still some markets that can only be served, at this point, by the robustness Unix delivers. Traditionally, high-end workstation markets have included mechanical computer-aided design (MCAD) and electronic design (ECAD) in industries as diverse as auto, finance and software design.

Changes in the workstation market

The rise of the personal workstation has dramatically changed the face of the workstation market in Canada — at least in terms of vendors.

In 1997, according to IDC, Hewlett-Packard Co. was the leading vendor with more than 14,000 units shipped in that year. Second was Sun Microsystems Inc. with approximately 8,000 units shipped. Following Sun were IBM, Digital, Compaq, Dell and Silicon Graphics. Since that time, the Windows NT/personal workstation market has been growing at a 15 per cent compound annual rate while the Unix market has been declining at a three per cent annual rate. Trends for both camps are expected to continue until 2003.

In 1999, 19,500 workstations were shipped in Canada. As much as 32.6 per cent of the market is now held by Dell.

Compaq follows at 23.7 per cent, then Hewlett-Packard at 21.6 per cent, followed by IBM at 14.7 per cent. Other workstations account for the remaining 7.4 per cent of the market, IDC Canada reports.

RISC machines no longer dominate

Three years ago, the workstation market was dominated by RISC (Reduced Instruction Set Computing) processor-based products running Unix operating systems and applications.

Today, several developments in this marketplace have allowed advanced application users to rely on other processors to provide comparable performance to a traditional workstation at a lower price.

A workstation-class system is a higher-performance computer specifically engineered for applications with more demanding processing, video and data requirements. It is intended for professional users who need exceptional performance for computer-aided design (CAD), geographic information systems (GIS), digital content creation (DCC), computer animation, software development and financial analysis.

With the introduction of Pentium II processors, many computer companies expanded their product lines to offer Intel-based workstations. The added performance provided by these and the successive Intel Pentium III and Pentium III Xeon processors has resulted in a strong shift from proprietary, traditional workstations to branded personal workstations, which use the Windows NT operating system.

Workstation users benefit from rapidly evolving processor technology. High performance workstation-class systems let power users be more productive as projects can be completed much faster, saving organizations time and money.

The workstation market has been one of the first to benefit from the set of instructions incorporated into Intel’s Pentium III processors, called Streaming SIMD Extensions (SSE). This performance improvement will come from the new SSE-enhanced applications and drivers being introduced by hardware and software vendors.

Most branded workstations also provide the option to add a second processor, allowing users to take advantage of the multi-tasking and multi-threading capabilities of their applications and operating systems.
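
A minimal sketch, assuming a POSIX system with pthreads (the array and the two-way split are illustrative, not from the article), of the kind of multi-threading a second processor can exploit: two worker threads each sum half of a data set, so the operating system can schedule them on separate CPUs.

    /* Illustrative only: split a summation across two POSIX threads. */
    #include <pthread.h>
    #include <stdio.h>

    #define N 1000000
    static double data[N];

    struct range { int start; int end; double sum; };

    static void *partial_sum(void *arg)
    {
        struct range *r = arg;
        r->sum = 0.0;
        for (int i = r->start; i < r->end; i++)
            r->sum += data[i];
        return NULL;
    }

    int main(void)
    {
        for (int i = 0; i < N; i++)
            data[i] = i * 0.001;

        struct range halves[2] = { { 0, N / 2, 0.0 }, { N / 2, N, 0.0 } };
        pthread_t threads[2];

        for (int t = 0; t < 2; t++)
            pthread_create(&threads[t], NULL, partial_sum, &halves[t]);
        for (int t = 0; t < 2; t++)
            pthread_join(threads[t], NULL);

        printf("total = %f\n", halves[0].sum + halves[1].sum);
        return 0;
    }

Compiled with cc -pthread, the two halves can genuinely run in parallel on a dual-processor box.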

In addition to dual processor support, workstation-class products are differentiated by their options for advanced OpenGL graphics adapters, greater memory and storage expansion, higher performance hard drives and application certification.

It is important to understand that not all 3D graphics cards are created equal. 3D video adapters can generally be categorized as those optimized for advanced workstation applications or those that are good for games.

OpenGL (OGL) support is the industry standard that separates a true workstation from a gaming machine.
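
One rough way to see which OpenGL implementation a card actually exposes is to create a context and query the driver strings. The GLUT-based sketch below is illustrative only (not from the article); a software renderer will typically identify itself in the renderer string.

    /* Illustrative only: print the OpenGL vendor/renderer/version strings. */
    #include <GL/glut.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutCreateWindow("OpenGL capability check");

        /* glGetString is only valid with a current context, which
         * glutCreateWindow has just made current. */
        printf("Vendor:   %s\n", (const char *) glGetString(GL_VENDOR));
        printf("Renderer: %s\n", (const char *) glGetString(GL_RENDERER));
        printf("Version:  %s\n", (const char *) glGetString(GL_VERSION));
        return 0;
    }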

Most of the workstation graphics glory goes to the high-end 3D video cards, but multiple monitors are also an important productivity tool for many workstation users. Two or more monitors can be of benefit to those who require more display space for increased efficiency and effectiveness while multi-tasking.

For instance, multiple monitors can help software developers create and debug applications by having an application on one screen and a debugger on another, or a programming editor on one and an online reference manual on the other.

GNOME: GNOME Montréal Summit Starts Tomorrow

View of the Old Port and downtown Montréal

The annual GNOME summit starts tomorrow. Contributors are gathering from around the world for four days of discussion and working sessions. Scheduled topics include Wayland, Boxes, and the GNOME continuous build system.

Traditionally held in Boston, this is the 13th GNOME summit, and the second to be held in Montréal.

The summit is an informal event and everyone is welcome. For those who are interested, there will be a happy hour community meetup with the Montreal Linux community on Saturday afternoon. More information can be found on the wiki.

Thanks to CRIM for hosting the event and Savoir-faire Linux for sponsoring. Thanks also to Red Hat for sponsoring the Sunday social event, and to Canonical for providing our hungry hackers with tea, coffee and bagels.

GNOME: GUADEC Keynote speakers: Matt Dalio


The life of Matt Dalio changed when, at a very young age, he lived for a year in Beijing, China. There he discovered the difficulties suffered by many children, many of whom were orphans.

Starting from this experience, Matt decided to start the China Care Foundation, an association that has raised over $14 million to provide support to Chinese children with special needs.

But Matt also has a dream: to improve the lives of millions through the use of free software in his latest endeavour, Endless Mobile.

Matt shared his visions and projects with us during his keynote talk at GUADEC.

Q: Your life changed when you first went to China at the age of 11. Do you think that the spirit of cooperation inside the China Care Foundation is comparable in any way to inspiring free software communities?

A: When I was first learning about the free software community and talking with individuals in that community, I was struck by just how much we have in common. We all want to give free access to people who need it. We all believe in the power of software to unlock the potential in people.

China Care Foundation is very much a collaborative effort. In the years since I founded it, the network of individuals who contribute — from dollars to volunteer time to giving an orphan a true home — has grown immensely. Right now, in addition to individual contributors, China Care has clubs on 52 campuses around the United States; college kids collaborating to give life-saving surgeries, foster care placements, and adoptive families to orphans in China. It has been incredible to see this network of people, from their respective places in life, working towards the same goal. There’s so much power in that.

Q: Tell us about your keynote at GUADEC.

A: Staggering statistic: 5 billion people on earth do not have Internet access.

We think that computers are everywhere, but they aren’t.  80 percent of the world does not have access. Isn’t that amazing? And yet you’d never know walking around our little corners of the world. What we don’t realize is that for all of the towns that we drive between and cities that we fly between, they are all pretty much part of the same little subset of the world. It’s like walking around on dry land and not realizing that 80 percent of life on earth exists under the sea. You wouldn’t know it unless someone told you about it.

My goal was to tell people about it. To give a vivid picture of what it looks like. To help people understand what the *middle* of the pyramid looks like. These are people who want computers. They have electricity. They are literate. And they have money. It’s not just that computers are too expensive for them. The real problem is that technology has never been built with them in mind. What does someone do when they live in a place that has no hope of getting Internet access?  What is a computer without the Internet?  It’s a Microsoft Word machine. So who would buy such a thing?  And yet that is 80 percent of the world.

The examples go on and on, of cases where you think about what technology could be for someone in that market. It could be infinitely more powerful than it is for you or me, because that person is also lacking access to the basic necessities. There are not enough good doctors to give quality health care. There are not enough teachers to give quality education. There are not enough good jobs. Yet a computer with the right applications can be an answer to all of that. Just being able to search Wikipedia for Dengue Fever can be enough to save a life. Imagine what else you could do. A link to Khan Academy or Code Academy is enough to change the direction of a life.

Technology has solved innumerable problems in the world, and yet the people who build technology don’t make a living from understanding what it means to people who do not have technology. So there isn’t really anyone building software for that part of the world, and those who do go way down to the bottom of the pyramid, where there are all sorts of other challenges.

My goal was to speak to the Gnome community about just how large an opportunity this can be for Gnome. Billions of people are waiting for a computer. Waiting for an operating system that is built with them in mind. And with just a little bit of effort and a little bit of understanding, we can reach them.

Q: What did you expect from GUADEC?

A: I certainly did not expect what I got. The response to my talk was overwhelming in the volume of support. Goodness, what a community. Plus, it was just such a great community of quality human beings. Really, I am proud to call it a part of my life.

If you missed Matt’s talk at GUADEC, read more on the Endless Mobile webpage!

Banshee: Banshee 2.9.0 released!

Banshee 2.9.0 has been released!

Banshee 2.9.0 is the first release in the 2.9 development series leading up to 3.0.
Read the release notes for more info.

BillReminder: Fanatics? The War Was Won!

I started out in 1994 as a Linux advocate, saying to myself, “This is great, but I wish it were easier to install and didn’t screw up my boot sector.” In 1995-1996, I forgot about it so that I could concentrate on applications. In 1997 I went into denial, and in 1998 I tried to remain objective.

In 1999, I’m taking a “who cares” attitude. I’m not denying Linux per se; I’m simply refusing to get caught up in an OS holy war.

It’s not 1999 yet, though, so I still have a little time left for some more denial–not just about Linux, but also about Windows 2000 and NetWare.

Linux fanatics out there, you’re going to have to get over this: In some aspects, Windows NT is a better operating system. The biggest NT advantage is that it has a development model and a ton of rich consumer-friendly applications.

Were that the only thing, Linux would be home free, since the mass acceptance of the operating system will spur on more applications. But Linux also has problems with its scheduler. I’ve written before that the Windows NT scheduler is not up to par with what is available on some Unix platforms (see PC Week, June 1, Page 71). However, NT’s scheduler makes Linux’s look like dog meat.

Another Linux problem is with I/O. An engineer I know says that Linux is rife with Ring 3 scaling problems. But he added that the operating system will “get there” soon enough.

The trouble with NT starts with its registry, which most engineers complain is a horrible mess. The only people who like the NT Registry are those who sell packages to “fix” it.

As bad as it is, the registry is the least of Microsoft’s worries. We have today a big need for 24-by-7 uptime, and NT just doesn’t cut it. NT doesn’t allow IT managers to gracefully kill rogue programs. It makes organizations reboot systems too often, even when minor, noncritical application-level changes are made. Sure, Microsoft and others have patches, kludges and fixes that let NT function in this environment. But corporations want guaranteed uptime; that’s why Linux is perfect here.
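
For contrast, a minimal sketch of the Unix-style graceful kill the author alludes to, assuming the rogue process ID is already known: send SIGTERM first so the program can clean up, and escalate to SIGKILL only if it ignores the request (the five-second grace period is an arbitrary choice for illustration).

    /* Illustrative only: graceful termination with escalation. */
    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/types.h>

    static int stop_rogue(pid_t pid)
    {
        if (kill(pid, SIGTERM) == -1) {        /* polite request to exit */
            perror("kill(SIGTERM)");
            return -1;
        }
        sleep(5);                              /* grace period for cleanup */
        if (kill(pid, 0) == 0) {               /* still alive? */
            fprintf(stderr, "pid %ld ignored SIGTERM, sending SIGKILL\n",
                    (long) pid);
            return kill(pid, SIGKILL);         /* forcible termination */
        }
        return 0;
    }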

NetWare 5.0 should have been poised to reap profits from a delay in Windows 2000 and the newness of Linux. Unfortunately, there are a ton of problems with NDS (Novell Directory Services), including incompatibilities between NDS with NetWare 5.0 and NDS with NetWare 4.0 implementations.

There are also unconfirmed reports that NetWare 5.0 is slower than NetWare 4.0 in some instances. The performance problem stems from NetWare’s single-threaded TCP/IP stack. But these performance differences are so slight that they shouldn’t make a big difference.

All this hand wringing is meaningless in a way. We in the press and in the community constantly operate in an “exclusive OR” world. That is, if something new comes along, we have to assume it will displace something else. But the buying practices of corporations rarely function in this way. Corporations buy to solve problems.

That’s why I see businesses forcing vendors to work together. The consumers will push Microsoft to accept Linux; they’ll push for development of stronger NT development (for example, Winsock) APIs on the Linux kernel. They’ll push Microsoft to accept NDS because consumers don’t plan to dump it.

Next year, though, Linux will be pushing other Unix vendors out of the market. The smartest move Novell could make would be to completely dump the NetWare code base and move all of the NetWare services to Linux. Caldera supports NetWare for Linux now.

Watch out for Caldera, by the way. In 1999, it’s going to make some Linux announcements that will knock your socks off.

Gtk2-Perl: Gnome2::VFS 1.082 available

Overview of changes in Gnome2-VFS 1.082: Avoid misusing the macro PL_na, thus preventing issues when Gnome2::VFS is used in conjunction with certain XS modules, among them XML::Parser and String::Approx. View the source in the Gtk2-Perl git repo at h…
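
For context, the PL_na issue these release notes mention comes from XS code passing the interpreter-global PL_na as the length argument to SvPV, which can clash with other XS modules doing the same. The fragment below is a hypothetical sketch of the usual fix (a local STRLEN variable); print_sv is not actual Gnome2::VFS code.

    /* Hypothetical XS fragment: avoid the shared PL_na global. */
    #include "EXTERN.h"
    #include "perl.h"
    #include "XSUB.h"

    static void print_sv(pTHX_ SV *sv)
    {
        /* Problematic pattern the release moves away from:
         *     char *s = SvPV(sv, PL_na);
         */
        STRLEN len;                              /* private length variable */
        const char *s = SvPV_const(sv, len);
        PerlIO_printf(PerlIO_stdout(), "%.*s\n", (int) len, s);
    }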

Gtk2-Perl: Gnome2 1.043

Overview of changes in Gnome2 1.043: Avoid misusing the macro PL_na, thus preventing issues when Gnome2 is used in conjunction with certain XS modules, among them XML::Parser and String::Approx; Created %meta_merge, which is used to pass META_MERGE va…

Gtk2-Perl: Gtk2 1.248 available

Overview of changes in Gtk2 1.248 [2013-09-29]: Avoid misusing the macro PL_na, preventing potential issues with other XS modules. View the source in the Gtk2-Perl git repo at http://git.gnome.org/browse/perl-Gtk2/tag/?id=rel-1-24-8 or download the so…

Gtk2-Perl: Glib::Object::Introspection 0.016 (unstable) available

Overview of changes in Glib::Object::Introspection 0.016 (unstable): Add support for unicode character arguments to Perl callbacks; Avoid misusing the macro PL_na, preventing potential issues with other XS modules; Fix build on MinGW with dmake. View…

Gtk2-Perl: Glib 1.302 (stable) available

Overview of changes in Glib 1.302 (stable) [2013-09-29]: Avoid misusing the macro PL_na, preventing potential issues with other XS modules; Avoid memory corruption when registering boxed synonyms repeatedly. View the source in the Gtk2-Perl git repo a…

BillReminder: Corporate Acceptance Was Completely Necessary For Linux

For IT organizations such as West Virginia Network in Morgantown, which runs the network applications for the state’s higher education institutions on AIX RISC 6000, NT, and Intel platforms, “it would probably take a killer app to move us to Linux quickly,” says Jeff Brooks, lead systems programmer for WVNET.

On the desktop side, however, he says Linux is taking hold. “We have an increasing number of people running a Linux desktop environment for personal productivity, including mainframe programmers. IT professionals who are not supporting PC products are spending too much time maintaining their PCs. We calculated the time people spent doing non-business, non-mission-related maintenance–doing upgrades, or rebooting when that blue screen comes up on NT–and it’s not trivial.”

As an embracer of Linux, Java, and other things non-Microsoft, IBM, which dropped to Number 2 in the Software 500, grew its software revenue 6% to $11.9 billion, with total corporate revenue growing 4% to $81.7 billion. IBM, like other Top 10 companies Sun, HP, Compaq, and Hitachi, derives the bulk of its revenue from hardware sales.

IBM’s strategy for making money from software and services is a model the others are also following, each in their own way, says IDC’s Bozman. “If you look at IBM you see a model for how you can make more from software and services, but [IBM's offerings are] more marketable, more cross-platform. Look at Lotus and Tivoli.”

Like IBM, HP “realizes there is a synergy there in combining software sales and services.”

Sun, she says, uses software more as a lever to drive hardware sales. And unlike IBM, “Sun has a small professional services organization and doesn’t want to be concerned about competing with the Andersens, etc.”

Compaq entered the enterprise software fray with its acquisition of Digital Equipment, inheriting not only the software but Digital’s services organization. While Compaq so far has not been able to meld this acquisition as smoothly as it probably hoped, and recently replaced CEO Eckhard Pfeiffer with chairman and acting CEO Ben Rosen, analysts are positive about Compaq’s ability to move forward in the enterprise software world. “Compaq already has a lot of corporate IT accounts; they have extremely strong relations with the IT world. The ability to take DEC and integrate it is an obvious question mark, but you have to believe Compaq will make [DEC] an important part of the future,” says Tim Bajarin, president, Creative Strategies Inc, San Jose, Calif.

He adds, “Clearly they see enterprise-driven software as a key component to the overall value-added products they sell in the marketplace. The demand for complete systems solutions is on the rise, not the decline. I would be surprised if Compaq doesn’t get it right.”

Also pushing hard in the one-stop shopping realm is Computer Associates (#4), now busy absorbing its recently announced acquisition of Platinum Technology. CA in 1998 grew its software revenue 12% to almost $4.9 billion, with total corporate revenue growing to almost $5.1 billion. The combination of Platinum and CA, based on 1998 software revenue, would be $5.6 billion, which would rank the combined company this year at #3, surpassing Oracle. Size seems to matter in the systems management market, as the industry consolidates into a smaller group of large suppliers. Collectively, companies in the Software 500 that compete in the systems/network management space grew software revenue an average of 16.2%.

In the enterprise applications arena, where both Oracle (#3) and PeopleSoft (#10) compete, “the key priority for ERP vendors is to extend their applications and frameworks to the world of e-business,” says Steve Bonadio, senior analyst, Hurwitz Group, Framingham, Mass. “ERP will be the backbone that enables companies to efficiently and effectively communicate and collaborate with themselves, their customers, partners, and suppliers.”

Both Oracle and PeopleSoft grew at a good pace, with Oracle reporting a 20% growth in software revenue (which also includes their database business) to $5.3 billion. And PeopleSoft hit the billion dollar mark in 1998, with its 48% increase in software revenue. Both companies beat the average software revenue growth rate (17.4%) for companies in the Software 500 that compete in the ERP/enterprise applications market.

While the ERP suppliers are benefiting from a still-healthy demand for their solutions, corporate IT professionals still need to evaluate the health of their potential vendors, and the health of their strategy, says Hurwitz’s Bonadio. “As ERP vendors aggressively round out their product functionality and architectural strategies, companies using ERP applications need to make sure that their existing ERP investments are secure. There is nothing worse than spending millions and taking years to implement an ERP solution only to find out that you have to do it all over again.”

ERP was not the only software segment thriving in 1998. Among the Software 500, suppliers in both the Internet/e-commerce and data warehouse/OLAP markets saw software revenues rise an average of 17.8%.

Across all segments of the software industry, many companies grew through merger or acquisition, a trend that has continued as the industry matures. Among the Software 500, 32% merged with or acquired another company during 1998 (see “Here Today Gone Tomorrow,” pg. 28 for a look at what happened to some of last year’s Software 500 companies). While investment banks tend to say this is largely a positive trend for IT buyers, assuring the continuance of new and innovative products from startups and small companies, and enabling more one-stop shopping and more formalized support organizations, IT professionals are not so sure.

The supposed benefits of one-stop shopping “depend almost completely on the willingness of the vendor to try to fit our enterprise situation,” says WVNET’s Brooks. Some vendors, in his experience, “try very hard to lock you into multiyear contracts with little added value. Some conglomerates have offered us software they think is a great deal but that we don’t necessarily need. In that sense, the whole merger mania for us every year looks a little gloomier, because we have to deal with a smaller number of suppliers that give us heartburn. But then again, I’m not a stockholder.”

In addition, says Brooks, “we tend to lose relationships with developers we may have worked with for 10 years. All of a sudden there’s a layer of management between us and them–if they’re still there.”

Agreeing with Brooks is Lynn Manzano, Y2K project manager at Experian Inc., a leading supplier of information on consumers, businesses, motor vehicles, and property, in Orange, Calif. “You do lose a lot of continuity, and you lose the depth of support.” On the other hand, she says, “Hopefully I’ll get a better price. With some of these bundled purchases we saved a lot of money, from a cost perspective. From a functionality perspective, I don’t know yet” what the benefits of a merger or acquisition will be.

Another overriding concern for IT in 1998 was the Y2K issue. 1998 was the year the software industry and the IT community got serious about Y2K, scrambling to address millennium date issues before year-end, so 1999 could be spent testing. Among the Software 500, 89% of the companies reported that their primary software products are now Y2K compliant. Only 1% of the companies said they will not be compliant by year-end ’99, nor will they make it by 2000. And 10% of the companies did not respond.

The Y2K issue proved to be a double-edged sword for IT. On one hand, many organizations were forced to put off new development projects to concentrate on their millennium fix. On the other hand, Y2K has prompted a massive updating of legacy systems to, in many cases, new packaged applications.

Says WVNET’s Brooks, “In 1998 we were largely sidetracked by Y2K. Every application we were working with from mid-year 1998 through now had been with an eye to getting everything done for Y2K. The fringe benefit is massive updating. I think, architecturally, that’s clearing out a lot of deadwood. It’s really a new broom–inadvertently.”

And while it didn’t get quite the attention in the U.S. that the Y2K issue received, Europeans were busy grappling with the debut of the Euro currency, which observers say creates a more complex coding challenge than Y2K, as it affects fundamental business rules. Among the Software 500, which are predominantly U.S.-based companies, 50% reported that their primary software products are Euro compliant, while 6% said they are not compliant yet. Twenty-nine percent reported that Euro compliancy was not applicable to their software products, while 15% declined to answer.

With the Euro now launched, and Y2K soon to be winding down, what’s ahead for 1999 and beyond?

WVNET’s Brooks cites storage management as an area to watch–”the whole issue of integrating storage management across the enterprise, particularly the interoperability of storage and the prospects for data interchange on a rapid, secure basis.”

Says Mark Gorenberg, a partner in the venture capital firm Hummer Winblad Venture Partners, San Francisco, Calif., “Lower cost and plentifulness of storage is a huge trend for people in IT.”

Other trends, Gorenberg notes, include the management of the extended enterprise, the movement of ERP vendors to e-commerce, and the rise of vertical software markets. “Vertical markets are very much in vogue. It’s possible to create a vertical company as a standalone company now. Before, they couldn’t grow large enough.”

Brooks is looking to the new millennium to bring simplification. “It seems that over the 25 years I’ve been following the industry, complexity grows at a fairly good clip until people refuse to tolerate it, then something comes out to simplify it. It used to be the desktop environment was fairly complicated, then Windows came out, and pretty much everybody’s PC looked the same for a few years. There has been another one of those [trends] with the Web, but it’s not done yet. I have a feeling our architectural issues in about four years will look very different. But I’m not seeing anything from the pundits that satisfies me about what the next big thing could be.”

So is life e-beautiful for IT professionals today? Both WVNET’s Brooks and Experian’s Manzano agree there are lots of opportunities for IT professionals, both employment-wise and innovation-wise. And the software they have to work with keeps getting better. “The tools are significantly more compatible,” says Brooks. “You can have a toolbox and have some hope that most of them work together somewhat.” With more packaged products, he notes there are also fewer opportunities to provide solutions for users that the software vendors won’t, “but at the same time you can devote more time to doing other things, like developing new applications, or training, or doing production evaluation, and spend less time looking at code.”

Gorenberg adds, “There are a number of huge opportunities in IT. Outsourcing has created real capitalism for people in IT. Outsourcing and the whole service provider phenomenon are growing like gangbusters. For the first time venture firms are funding service companies, and those companies are growing and going public, making great IT people the new rock stars.”

GNOME Foundation: Minutes of the Board meeting – September 17th, 2013

wiki: https://wiki.gnome.org/FoundationBoard/Minutes/20130917

= Minutes for Tuesday, September 17th, 2013, 16:00 UTC =

== Next Meeting ==
* Tuesday, October 1st, 2013, 16:00 UTC

== Attending ==
* Andreas Nilsson
* Joanmarie Diggs
* Karen Sandler…

GNOME: GNOME 3.10 Released!

The latest update to GNOME 3, version 3.10, has been released. This release comes six months after the previous version, and includes new features, new applications, and many improvements.

Introducing the release, Allan Day (GNOME Design Team) said, “GNOME 3.10 is a significant upgrade for our users, and developers will benefit from new features in the application development platform. Our contributors did an incredible job and have created a really exciting release.”


Highlights in this release include:

  • A reworked system status area, which gives a more focused overview of your system.
  • “Software”, which provides an easy way to browse and install applications.
  • A collection of new applications, including Maps, Notes, Music and Photos.
  • New geolocation features, such as automatic time zones and world clocks.
  • High-resolution display and smart card support.

You can find out more details about these features, as well as the many other improvements, in the GNOME 3.10 release notes.

GNOME 3.10 also introduces initial Wayland support. This represents a major technological step forward for GNOME, and will enable the project to fully adopt the next generation display and input technology in the future.

The GNOME Project is a member of the GNU Project, and GNOME 3.10 comes just days before GNU’s 30th anniversary. Speaking about the 3.10 release, John Sullivan, Executive Director of the Free Software Foundation, said: “The GNOME 3.10 release exemplifies what GNU is about — technical and ethical excellence. The Free Software Foundation is proud to showcase the GNOME community’s work when talking to potential new free software users, and as GNOME users ourselves, we’re very thankful for these new improvements.”

Further information and reactions can be found in the GNOME 3.10 press release.

GNOME: GNOME Foundation Announces GNOME 3.10

Orinda, CA – The GNOME Project is proud to release GNOME 3.10 today. The latest milestone release in the GNOME 3 series includes many new features, applications and bug fixes, as well as enhancements and updates to many existing applications.

“Days before the GNU System’s 30th birthday, the GNOME 3.10 release exemplifies what GNU is about — technical and ethical excellence,” said John Sullivan, Executive Director of the Free Software Foundation. “The Free Software Foundation is proud to showcase the GNOME community’s work when talking to potential new free software users, and as GNOME users ourselves, we’re very thankful for these new improvements.”

Highlights for GNOME 3.10 include:

  • Experimental Wayland support.
  • A reworked system status area, which gives a more focused overview of your system.
  • Three new applications (which are technology previews): Maps, Music and Software.
  • Three new additions to the core set of GNOME applications: Notes, Photos and Weather.
  • “Software”, which provides an easy way to browse and install applications.
  • New geolocation features, such as automatic time zones and world clocks.
  • The ability to set a custom image on the lock screen.
  • High-resolution display support.

For developers, there are new GTK widgets, a geolocation framework that will allow location-aware applications, and the ability to define composite widgets using XML.
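
As a rough sketch of the composite-widget feature (the type name, resource path and child id below are hypothetical, not taken from the release), a widget class can now load its layout from GtkBuilder XML at class-init time and bind named children to struct members:

    /* Hypothetical example of a GTK 3.10 composite widget template. */
    #include <gtk/gtk.h>

    typedef struct { GtkDialog parent_instance; GtkWidget *ok_button; } MyDialog;
    typedef struct { GtkDialogClass parent_class; } MyDialogClass;

    G_DEFINE_TYPE (MyDialog, my_dialog, GTK_TYPE_DIALOG)

    static void
    my_dialog_class_init (MyDialogClass *klass)
    {
      GtkWidgetClass *widget_class = GTK_WIDGET_CLASS (klass);

      /* The template XML ships as a GResource; this path is a placeholder. */
      gtk_widget_class_set_template_from_resource (widget_class,
                                                   "/org/example/mydialog.ui");
      /* Bind the <object> with id "ok_button" to MyDialog->ok_button. */
      gtk_widget_class_bind_template_child (widget_class, MyDialog, ok_button);
    }

    static void
    my_dialog_init (MyDialog *self)
    {
      gtk_widget_init_template (GTK_WIDGET (self));
    }

Creating an instance with g_object_new (my_dialog_get_type (), NULL) then builds the children declared in the XML and fills in the bound pointers.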

“GNOME has pioneered the development and support of code that is now core infrastructure for many diverse Free desktops, including DBus, accessibility support and Network Manager,” said Matthew Garrett, Linux kernel developer and security expert. “As a result, it’s unsurprising that GNOME is the first to ship with support for a next-generation display server in the form of Wayland. GNOME’s commitment to improving the underlying platform is vital to the future of Free Software and provides a service to the entire community.”

Users will see many changes in this release, which allows greater customization than previous releases, such as the ability to customize the background of the lock screen. Other changes include app browsing using pagination instead of scrolling, fine scrolling in applications with precise movements, an enhanced and redesigned login screen, and improvements to user settings. Finally, the system status menu has been redesigned, consolidating many of the smaller menus, including Wi-Fi, Bluetooth, sound, brightness and power, into a single drop-down menu that provides quick, easy access to all of them.

A new Software application provides a consistent, GNOME-centric interface for installing and maintaining software, regardless of the distribution you use. In the future, Software will be improved to include comments, ratings and other features to help you choose the best software for your tasks.

For those using new hardware like the Chromebook Pixel, support for high-resolution displays will ensure a consistent look independent of the resolution.

Allan Day, who was recognized at GUADEC as 2013’s most distinguished contributor, said, “GNOME 3.10 is a significant upgrade for our users, and developers will benefit from new features in the application development platform. Our contributors did an incredible job and have created a really exciting release.”

The GNOME Foundation thanks all of the contributors for their hard work, perseverance, and vision in this release.

About GNOME

GNOME was started in 1997 by two then-university students, Miguel de Icaza and Federico Mena Quintero. Their aim: to produce a free (as in freedom) desktop environment. Since then, GNOME has grown into a hugely successful enterprise. Used by millions of people around the world, it is the most popular environment for GNU/Linux and UNIX-type operating systems. GNOME’s software has been utilized in successful, large-scale enterprise and public deployments.

The GNOME community is made up of hundreds of contributors from all over the world, many of whom are volunteers. This community is supported by the GNOME Foundation, an independent non-profit organization that provides financial, organizational and legal assistance. The Foundation is a democratic institution that is directed by its members, who are all active GNOME contributors. GNOME and its Foundation work to promote software freedom through the creation of innovative, accessible, and beautiful user experiences.

Development: GNOME 3.10 Released

GNOME 3.10 Released
===================

Today, the GNOME Project celebrates the release of GNOME 3.10. This
is the fifth major update to GNOME 3. It builds on the foundations
that we have laid with the previous 3.x re…

GStreamer: GStreamer Core and Plugins 1.2.0 stable release

The GStreamer team is pleased to announce the first release of the stable
1.2 release series. The 1.2 release series is adding new features on top
of the 1.0 series and is part of the API and ABI-stable 1.x release series of the GStreamer multimedia …