Let’s wrap up the year with a new release; tarballs are due on
Please make sure that your tarballs will be uploaded before Monday
A couple of days ago I was telling you about the possibility of Canonical forking GNOME Control Center and GNOME Settings Daemon for Ubuntu 14.04 Trusty Tahr. Well, this was confirmed by Robert Ancell, Technical Lead at Canonical, on the Ubuntu Desktop mailing list. The forks will have a limited lifespan and there won’t be any […]
The annual WebKitGTK+ hackfest is currently underway in A Coruña, Spain. Going into its fifth year, 2013's event is the biggest so far, including an unprecedented 30 participants.
The four day event is focusing on everything web in GNOME and GTK+. WebKitGTK+ developer Claudio Saavedra reports that multiple web processes and user interface improvements to the GNOME web browser are both priorities for the hackfest.
You can follow all the action on the WebKitGTK+ Twitter feed.
The event has been made possible by sponsorship of Igalia and the GNOME Foundation. Other sponsors are helping with travel costs, accommodation, coffee and snacks. Many thanks to Samsung, Adobe, Company100 and Cable Labs.
The second release of the GNOME 3.12 development cycle is here.
To compile GNOME 3.11.2, you can use the jhbuild modulesets.
You can’t work in the information technology sector and not be touched by Linux in some way, even if it’s only in a debate about what role Linux might play in your organization.
Linux is an Open Source operating system developed by a team of programmers led by Linus Torvalds and originally targeted at the Intel x86 platform. Linux is a UNIX-like operating system, but since it was written completely from scratch, it uses no original UNIX source code. On the positive side, that means there are no copyright restrictions on the code. The biggest disadvantage, however, is that the operating system isn’t built upon a base of heavily used code. It also doesn’t have all the features you’d expect from a production UNIX system, such as large-scale storage management.
Red Hat recently introduced an Enterprise Edition that includes Computer Associates’ ARCserveIT. The tacit admission: even a 100 percent Open Source operating system still leaves high-end users wanting.
Linux began in the early 1990s and has since been ported from Intel to Motorola, Alpha, SPARC, and other major platforms. Because of its broad base of programmer support and its ability to run software from the large GNU software library, Linux is a viable alternative to both commercial UNIX servers and Microsoft Windows and NT client machines. Ease of use, a common concern with UNIX, is less of a problem with two widely available desktop environments, KDE and GNOME, both of which are easy for Windows or Macintosh OS users and developers to learn. Both environments support a variety of themes that affect the appearance of windows, buttons, and scrollbars, letting users adopt a Windows or Macintosh look and feel.
To be a serious contender in the e-business market, Linux needs to offer more than a pretty desktop, and it does. First, it’s becoming the operating system of choice for high-end server vendors, such as IBM, Silicon Graphics, and Compaq Computer. Other vendors, such as Dell Computer, offer Linux preconfigured on their servers. Customer support is available from hardware manufacturers, as well as from Linux distributors like Red Hat, Caldera, and SuSE.
Also, Linux tools abound. The most popular Linux distributions include web and FTP servers, as well as other Internet applications such as Gopher, Domain Name Services (DNS), Mail, News, Proxy, and Search servers. The breadth of applications available with the widely supported, inexpensive operating system makes it an ideal candidate for web server applications. Industry analysts report more than 30 percent of web servers run on Linux.
Apache: Web server of choice
The Apache Web Server is a freely distributed HTTP server developed by the Apache Group and managed by the Apache Project (http://www.apache.org). Evolving from the National Center for Supercomputing Applications (NCSA) web server developed at the University of Illinois, Apache has become the most popular web server. According to a Netcraft survey of more than 13 million web sites conducted in March 2000, (http://www.netcraft.com/survey), Apache is used by 60 percent of the respondents. The next most popular server, Microsoft IIS, came in at just under 21 percent.
Popularity, though, is hardly the only consideration in judging e-business software. Scalability, reliability, and integration with other applications are crucial. Apache has been a stable platform on UNIX for some time, but Windows implementations haven’t proved as reliable. The Apache Group’s recently announced Apache 2.0 includes better support for non-UNIX platforms, which should improve Windows stability.
On the performance side, Apache 2.0 supports a hybrid multiprocess/multithreaded mode that promises to improve scalability, though more real-world use is required before we know how well it meets that promise. The Apache Software Foundation supports projects focused on integrating Apache into the larger e-business environment, including XML-Apache, Java-Apache, and Java Servlet and Java Server Page support. The mod_perl project gives developers the tools to create Apache modules in Perl that eliminate the need to run CGI scripts in a separate process, thus avoiding costly process instantiation.
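The saving mod_perl targets can be sketched in miniature. The following is an illustrative Python stand-in (the article's subject is Perl; the handler and function names here are invented): a CGI-style server launches a fresh interpreter process for every request, while an embedded handler runs inside the server process and pays no per-request startup cost.

```python
import subprocess
import sys

def handler(name: str) -> str:
    """In-process handler, analogous to a mod_perl handler:
    no new process is created per request."""
    return "Hello, " + name

def cgi_style(name: str) -> str:
    """CGI-style handling: spawn a fresh interpreter for each
    request, paying process-startup cost every time."""
    result = subprocess.run(
        [sys.executable, "-c", f"print('Hello, ' + {name!r})"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Both paths produce the same response; only the per-request
# cost differs -- the process spawn is what mod_perl eliminates.
assert handler("world") == cgi_style("world")
```

The response is identical either way; the difference is purely the overhead of creating and tearing down a process on every hit, which is where the large throughput gains come from.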
Web servers also must provide access to middle-tier business services, especially those driven by database applications. Apache is recognized as a viable HTTP server by high-end web applications servers, such as the Oracle Application Server and IBM WebSphere.
Like Linux, Apache’s wide market acceptance demonstrates that Open Source development can create an essential tool for e-business. The release of Apache 2.0 (which was in alpha testing at press time) further shows that Open Source software can keep up with the demands of changing needs. However, while Apache is a solid choice for UNIX and Linux platforms, until a stable Windows version is available (and perhaps it’s close), Microsoft IIS is probably a better alternative for NT.
Perl: Portable programming
While C/C++ is a portable programming language, the learning curve and time required to develop fully functioning applications are often prohibitive. While Java has eliminated some of the most (potentially) problematic aspects of C–especially pointers–and offers a rich set of libraries like C/C++, development is still coding-intensive. The Perl programming language has emerged as the programming tool of choice for as many as one million developers, according to The Perl Journal.
Perl was originally used as a systems administration tool on UNIX platforms, but was quickly adopted for web development and data-intensive programming because it includes many features found in other tools, such as C, awk, sed, and BASIC. According to http://www.perl.com, Perl is the most popular web programming language, due in large part to its powerful text manipulation. From an e-business perspective, the fact that Perl runs on so many platforms and can handle operating system administration tasks, as well as more traditional database-oriented tasks, makes it an ideal candidate for a development tool. Since Perl applications can be written to run as embedded modules in Apache, web server processing can improve by as much as 2,000 percent.
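The text-manipulation strength mentioned above is easy to picture. Here is a rough sketch in Python (a stand-in for Perl; the access-log lines are invented for illustration) of the kind of terse log-crunching task that made Perl the default tool for web work:

```python
import re

# Hypothetical access-log lines, invented for illustration.
log = """\
192.168.0.1 GET /index.html 200
192.168.0.2 GET /missing.html 404
192.168.0.1 POST /cgi-bin/order.pl 200
"""

# Count requests per HTTP status code -- the kind of one-liner
# text processing Perl excels at.
counts = {}
for match in re.finditer(r"\s(\d{3})$", log, flags=re.MULTILINE):
    status = match.group(1)
    counts[status] = counts.get(status, 0) + 1

print(counts)  # {'200': 2, '404': 1}
```

In Perl the same job fits comfortably on a couple of lines, which is precisely why it became the workhorse for web-server reporting and data munging.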
Can Perl pass muster in an e-business environment? Amazon.com and Deja.com are two of the major dot-coms that use Perl to run their sites. In addition to Apache integration, the nsapi_perl module embeds a Perl interpreter in a Netscape web server, so e-businesses don’t have to give up the faster performance of an embedded interpreter if they don’t use Apache.
The Perl language is also widely supported by third-party developers, with more than 400 modules available from the Comprehensive Perl Archive Network (CPAN) (http://www.perl.com/CPAN/). Perl has been ported to the major UNIX platforms, as well as Microsoft Windows and NT, and the Macintosh. Also, Perl integrates easily with databases through DBI modules, making it an ideal tool for creating applications from database-centric web pages to data warehouse extraction, transformation, and load scripts.
While popular, the Perl syntax can be cryptic. This is understandable given its origin as a systems management-oriented tool, but it limits Perl’s use for large-scale software development. Python, an object-oriented scripting language, is a better choice if you’re looking at larger applications where reuse and object orientation will pay off.
Databases: The weak link
When we think of e-business and databases we generally think of the big names: Oracle, IBM DB2, Microsoft SQL Server, Sybase, and Informix. Occasionally, we’ll hear about the two most popular Open Source offerings–MySQL and Postgres–but not too often. Why not? Those offerings simply can’t compete with the features and functionality of today’s commercial relational database management systems. In some cases, the lack of features makes the Open Source databases unusable in an e-business environment.
For example, MySQL’s lack of subqueries and right outer joins requires developers to craft workarounds. More seriously, the database’s lack of transaction support eliminates it as a serious contender for e-business. The transaction support found in major commercial offerings makes it possible to define a logical unit of work consisting of a series of steps that must all occur for the transaction to be completed. For example, transferring funds from your savings to your checking account may consist of two distinct steps–withdrawing money from savings and then depositing it into checking. Without a way to group those two steps, the withdrawal could be made without the corresponding deposit. That could occur if the server crashed in the middle of the operation, and no server is immune to crashing or other problems that could disrupt a transaction.
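The two-step transfer above maps directly onto transactional SQL. The following is a minimal sketch using Python's sqlite3 module as a stand-in database (table layout and amounts are invented for illustration): a failure between the two UPDATEs rolls both back, so the money can never simply vanish.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("savings", 100), ("checking", 0)])
conn.commit()

def transfer(conn, amount, fail_midway=False):
    """Move funds between accounts as one logical unit of work.
    If anything fails between the two UPDATEs, roll both back."""
    try:
        conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE name = 'savings'",
            (amount,))
        if fail_midway:
            # Simulate the server crashing mid-operation.
            raise RuntimeError("server crashed mid-transfer")
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE name = 'checking'",
            (amount,))
        conn.commit()
    except Exception:
        conn.rollback()
        raise

# A simulated crash leaves both balances untouched.
try:
    transfer(conn, 40, fail_midway=True)
except RuntimeError:
    pass
print(dict(conn.execute("SELECT name, balance FROM accounts")))
# {'savings': 100, 'checking': 0}
```

Without transaction support, the first UPDATE would stand on its own, and a crash at the marked point would leave the savings account debited with nothing deposited, which is exactly the failure mode that rules MySQL out here.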
PostgreSQL, originally developed at the University of California at Berkeley, has many of the features found in commercial database systems, including transactions, stored procedures, and extensive SQL support. If you’re looking for an Open Source database, this is probably the best choice. However, remember that there’s much more to a database than what the programmer sees. If you need integration with enterprise resource planning systems, failure recovery, parallel query optimization, advanced partitioning options, or support for data warehousing (such as summarization and query redirection), then the commercial offerings are your best bet.
Conclusion: No simple answer
Can an e-business succeed with Open Source? Of course. TCP/IP and other core Internet protocols were developed in an Open Source environment. The key to success is not to blindly ignore or embrace Open Source any more than you would a particular vendor’s offering.
Open Source offers strong programming tools, Web servers, and a popular operating system that’s making steady inroads into production environments. As with your Web server, your database must be reliable, scalable, and easy to integrate with your other systems. If you stick to the most widely supported Open Source solutions–Linux, Apache, and Perl, to name just three–you can build a stable, reliable e-business platform.
Linux isn’t as established as UNIX, and it may not scale or promise the kinds of uptime you expect from UNIX, but for small- and mid-sized web sites, it may fit the bill. Apache and Perl both have strong developer support and in key areas, such as database access and performance, they get as much attention as any commercial product would.
For real-world, large-scale e-business, Open Source mixed with commercial applications is the best approach. While making significant inroads, Open Source still can’t marshal the resources to keep up with changes in technology. Red Hat’s alliance with Computer Associates for high-end storage options is a case in point. While advanced storage applications and robust databases may emerge from Open Source development, there aren’t any now–and we need them now.
Here comes GNOME 3.10.2, the second update to GNOME 3.10, it includes
For more information about the major changes in GNOME 3.10, please
The GStreamer team is pleased to announce a new release of the stable
Many modules got new fixes, translations, and documentation updates;
Tarballs are due on 2013-11-11 before 23:59 UTC for the GNOME 3.10.2
Blogs.gnome.org now uses WordPress 3.7.1. All plugins and themes have been updated to latest versions too. Enjoy!
GNOME 3.12 development cycle is starting up with this 3.11.1 snapshot. Lots of new features are still being proposed and discussed:
https://wiki.gnome.org/ThreePointEleven/Features/

To compile GNOME 3.11.1, you can use the jhbuild (http://library.gnome.org/devel/jhbuild/) modulesets (http://download.gnome.org/teams/releng/3.11.1/), which use the exact tarball versions from the official release.

The release notes that describe the changes between 3.10.1 and 3.11.1 are available. Go read them to learn what's new in this release:

core - http://download.gnome.org/core/3.11/3.11.1/NEWS
apps - http://download.gnome.org/apps/3.11/3.11.1/NEWS

The GNOME 3.11.1 release is available here:

core sources - http://download.gnome.org/core/3.11/3.11.1
apps sources - http://download.gnome.org/apps/3.11/3.11.1

WARNING! WARNING! WARNING!
--------------------------
This release is a snapshot of early development code. Although it is buildable and usable, it is primarily intended for testing and hacking purposes. GNOME uses odd minor version numbers to indicate development status.

For more information about 3.11, the full schedule, the official module lists and the proposed module lists, please see our colorful 3.11 page:
http://www.gnome.org/start/unstable

For a quick overview of the GNOME schedule, please see:
http://live.gnome.org/Schedule

Enjoy,
Javier Jardón Cabezas
GNOME Release Team
This could be described as the year of the operating system. Microsoft Corp. is racing to finish Windows 2000 before 2000, IBM and Santa Cruz Operation (SCO) are forging a new version of Unix for Intel 64-bit processors and Compaq, Sun and Hewlett-Packard are toughening their proprietary Unix versions.
For an IS manager it’s an embarrassment of riches – and a time of confusion. The expected lock-down of systems starting this summer due to Year 2000 concerns may give users a few months of breathing room and time to compile a list of questions.
Among them: Is it worth waiting for the much-delayed Win2000 or should we stick to NT Server 4.0? Will Win2000 be as reliable as Unix? Will the Data Centre edition be worthy of a data centre? And, as always, is Windows’ alleged cost of ownership advantage just window-dressing?
“NT versus Unix? I get this question from clients two times a day,” says Tom Bittman, vice-president and research director at the Gartner Group in Stamford, Conn. With more mission-critical applications coming out, NT supporters want them added to the IT mix. But managers wonder if Microsoft is up to the task.
“In almost all cases they’re leaning towards Unix,” said Bittman, “and wondering if they’re crazy.”
But Bittman warned NT is “probably two years behind the hype.”
“NT still has issues of scalability for the majority of the market, and that’s one reason many companies are going with Unix.”
In fact, he added, Unix is undergoing a bit of a come-back.
“We’re definitely seeing the start of a slow-down in the server space. A lot of companies are realizing Unix still has a place, and that NT is still missing some attributes. The belief is they may be fixed in Windows 2000. But while they’re waiting, that puts more of a focus on Unix for certain mission-critical and back-end applications.”
Microsoft’s ambitions are high: the Data Centre edition will scale up to 16 processors and 64 GB of memory, said Erik Moll, Windows 2000 marketing manager for Microsoft Canada. The Advanced Server edition (formerly Enterprise Edition) will go up to four-way symmetric multiprocessing. Both will have advanced clustering and load-balancing capabilities.
Other features will include the Active Directory, a management service with file-level encryption and support for Public Key Infrastructure, and Web and Internet support services.
“There’s no question as we move forward to Windows 2000 one of our key focuses will be on reliability and availability,” said Moll. “We’ve greatly reduced the number of reboots as you do reconfigurations of your system, and we also worked hard to make sure device drivers have gone through compatibility testing.”
Until Win2000 is proven, however, users are looking at NT.
For Bittman, Unix has it over NT for scalability, mixed workloads, high availability and reliability, maturity of applications and the availability of people skilled enough to run mission-critical deployments.
Expect more than 200 concurrent users on your system? Don’t use NT, he advises, unless perhaps you’ve got SAP R/3. Gartner has seen SAP implementations with 850 users, not big by Unix standards, but impressive considering most applications on NT can’t handle more than 200 users. It speaks to how well SAP has been tuned for Windows, said Bittman.
However, he added, the majority of the market would need about 400 users, and that’s where Win2000 is headed.
As scalability declines as an issue, high availability and reliability will become more important, said Bittman. Microsoft’s “Wolfpack” clustering technology is two years behind Unix, said Bittman, but will be better in Win2000.
“Our biggest concern is NT reliability,” he said, “and it’s not getting better. In fact, I have clients who tell me in their view, every release of NT has only gotten worse. I don’t think that’s true, but there’s that perception out there.”
The biggest issue is memory leaks. Microsoft acknowledges the problem, but most leaks will be plugged only in Win2000. Meanwhile, Gartner tells clients to expect an NT system to go down at least once a week. For companies that can tolerate that kind of performance, he said, NT will be good enough.
However, Ritchie Leslie, director of Montreal-based DMR Consulting Group Inc.’s Microsoft strategic alliance, believes NT has a big place in the enterprise.
“For medium-scale applications, say transaction systems supporting a few hundred users, Windows environments have a clear total cost of ownership advantage over Unix environments,” he said. “NT is cheaper to buy, most organizations have NT servers so if they can avoid having to buy a specialized Unix box to run an application, they can reduce the total cost.”
There have been “huge advances” in NT management tools, he said, adding the operating system is catching up in stability and reliability.
“For all but the very largest applications, Windows NT is becoming an increasingly practical platform,” he said. But, he added, it’s not ready for a large data warehouse or large transaction-based system.
Bev Crone, who watches both platforms as general manager of midrange system sales for IBM Canada Ltd., acknowledges that NT/Win2000 use will continue to eat into the Unix market. “At first blush,” he said, Unix looks more expensive than Microsoft’s offering, but not when the user considers factors such as reliability, availability and scalability.
Unix vendors aren’t standing still, he added, pointing to the IBM Unix collaboration on Project Monterey for Intel chips.
But Bittman is skeptical. Santa Cruz, Calif.-based SCO Inc. hasn’t set sales records with UnixWare, its Intel offering, he said.
Vendors are only now realizing that the first version of the IA-64 chip will create a small market, which won’t grow until the next generation of processors, called McKinley, debuts.
While some analysts believe Microsoft is aiming for an October release of Win2000 Professional, Server Edition and Advanced Server, Bittman says that will only be for “bragging rights.” He expects it will come out next year and will be filled with bugs that weren’t caught during beta testing, because users are telling him they aren’t testing it with mission-critical apps.
“We’re telling clients don’t deploy it in a production mode for at least six to nine months after general availability, after at least one service pack and maybe two,” Bittman said. “We’re also telling them it will be less reliable than NT 4 with service pack 5 for a year.”
Anyone marked as developer in Bugzilla can now:
So if you have some person who you want to give Bugzilla permissions,
= Minutes for Tuesday, October 15th, 2013, 16:00 UTC =
== Next Meeting ==
== Attending ==
I just published our schedule for 3.11/3.12 on our wiki, it’s
[An ics file is also avai…
The deadline for the next release of Gtk-Perl modules will be Saturday, November 23rd 2013 at 00:00 UTC. Please see the mailing list post ( http://bit.ly/161TZH0 ) for more information.
Overview of changes in Gnome2::Vte 0.11 [2013-10-22]: RT#89113 – Use v2 of CPAN::Meta::Spec in Makefile.PL; this adds the “nice bits” to MetaCPAN (git clone URL, project homepage URL, etc.). Thanks to Gabor Szabo for the bug report. View the source in…
Overview of changes in Gnome2 1.044 [2013-10-19]: RT#89191: Fix FSF’s address; RT#89188: Make enums.pod depend on Gnome2 directory creation. View the source in the Gtk2-Perl git repo at https://git.gnome.org/browse/perl-Gnome2/tag/?id=rel-1-04-4 or do…
Overview of changes in Gtk3 0.014 [2013-10-18]: dist.ini: document ‘is_trial’, sets ‘testing’ in metadata; Add MetaJSON, set MetaYAML version = 2; Add [MetaResources] block with correct URLs; fixes RT#89118. The metadata changes have populated the Gtk…