One of the MANY nice things coming in APEX 5.0 is the native integration of the jQuery-based FullCalendar, known in the APEX 5.0 Builder as the CSS Calendar. While the native integration will be nice, you shouldn't have to wait for APEX 5.0 for this functionality.

Enter the Enkitec FullCalendar plugin. We integrated FullCalendar into our website quite some time ago to enable us to display our Education and Conference calendars. Recently I've taken the basis for that code and created a plug-in that works with APEX 4.1 and 4.2.

The plugin allows the developer to decide the following:

  • The jQuery UI Theme to be used to render the calendar
  • What the header of the Calendar contains and where
    • Navigation
    • Title
    • View
  • Which calendar views, in addition to MONTH, the end user sees (Week and/or Agenda/Day)
  • Whether or not an event is considered an "All-Day" event
  • Styling for each individual event
    • Either event_color and text_color or a CSS Class to be assigned.
  • Tooltip text to display when an end user hovers over an event in the calendar
  • Height & Width of the Calendar
  • Which day should be considered the first day of the week
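Under the covers, these declarative settings map onto FullCalendar's own options. As a rough sketch (the option names below are FullCalendar's; the specific values and event are made up for illustration, and are not the plug-in's actual attribute names):

```javascript
// Illustrative mapping of the plug-in's settings to FullCalendar options.
var calendarOptions = {
  theme: true,  // render using the chosen jQuery UI theme
  header: {     // what the header contains and where
    left: 'prev,next today',             // navigation
    center: 'title',                     // title
    right: 'month,agendaWeek,agendaDay'  // views (Month, Week, Agenda/Day)
  },
  firstDay: 1,  // first day of the week: 0 = Sunday, 1 = Monday, ...
  height: 500,
  events: [{
    title: 'Enkitec APEX Class',
    start: '2013-06-01',
    allDay: true,                // the "All-Day" event flag
    color: '#3a87ad',            // event_color
    textColor: '#ffffff',        // text_color
    className: 'training-event'  // ...or a CSS class instead
  }]
};
// The widget is then initialized with something like:
// $('#calendar').fullCalendar(calendarOptions);
```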

While there are still things I want to do to this plug-in, I thought I’d go ahead and release it so that the APEX community could start using it now.  

Will the plug-in continue to be developed once 5.0 is finally released? I suppose that depends on whether the APEX team implements the same or better functionality than what I’ve managed to do so far.

In the meantime, you can download the plug-in from here:

If you're going to Oracle Open World and will be there for the ODTUG Sunday APEX Symposium, then you're in for quite an interesting set of sessions. When discussing what would be a good topic for this year's set of presentations, we wanted to focus on an area within APEX from which everyone could gain some benefit. Although there are many different topics we could have addressed, the one that kept coming to the forefront, and the one that we knew everyone would have some interest in, was generating printable PDFs from APEX.

If you've been in the APEX world for any length of time, you know that "Printing Ain't Easy". Although there are a number of potential solutions to the print problem, they all come with their own benefits and caveats. This year's symposium is aimed at presenting four of the possible solutions, how to implement them, and the features, benefits, and drawbacks of each.

Here's the Schedule:

 All presentations will take place Sunday, September 22 in Moscone West - Room 2005.

If you're interested in generating PDF reports from APEX, then these sessions are not to be missed.

KSCOPE 13 is coming up quickly, and so is the deadline for early-bird registration.

If you haven't ever been, KSCOPE is, dare I say, the best technical Oracle conference in the country, and potentially the world, especially if you're interested in Oracle APEX. Along with excellent in-depth APEX coverage, the agenda includes coverage of:

  • ADF & Fusion Middleware
  • Developer's Toolkit
  • Oracle Core Database 
  • .NET
  • Business Intelligence
  • Essbase
  • And Many, MANY more…

The conference runs from Saturday, June 22 through Thursday, June 27, and will give you an opportunity to wish me a Happy Birthday on June 23.

If you register on or before March 25th, you can save $300 on registration!

Do yourself and your career a favor and get to New Orleans this year for KSCOPE!

I hope to see you there!!



Those who know me probably know I've been a Mac convert for a number of years. At home I have several Macs for various purposes, but at work I have a MacBook Air and a 27" Cinema Display that I use to do most of my work.  

I've always had a problem with the fact that the Mac menu bar stays anchored to the main screen, forcing me to break concentration from my work, move my mouse down to the lower screen, click the menu, and then navigate back up.

Yes, I know that you can move the menu bar to whichever screen you want, but that doesn't solve the problem completely. What if what I'm working on is on the bottom screen and I've moved the menu bar to the top?

MenuEverywhere to the rescue!

I stumbled across this little gem on MacUpdate while searching online for a solution to my mad mouse scrolling problems. 

MenuEverywhere gives you lots of options, including duplicating the menu bar on secondary screens, attaching an app's menu bar to the app itself (a la WINDOZE), allowing a pop-up menu bar based on a shortcut or button, etc.

It also gives you full control over look and feel, fonts, styles, and so on.

Now, I don't have to break concentration to get to the working menu that I need.

All is now right with the world again.


Since joining forces with Enkitec, I've managed to stay blissfully unaware of some of the things that they're best known for. Namely being top of the food chain when it comes to knowing what's what about Exadata.

 However, that's about to change!

A couple of the larger clients we worked with while we were still Sumneva have become so successful that they're working to migrate their systems, including their APEX applications, to Exadata platforms.

Now this in itself isn't a big deal. We all know that APEX runs inside the database and that it doesn't care what the underlying hardware platform is. Exadata machines run Oracle, so it's a no-brainer really.

The fun comes in when you start to think about the Exadata Secret Sauce, which in a nutshell has everything to do with the storage nodes and very little to do with the server nodes running the database. Those storage nodes are what provide the extreme speed and scalability to the Exadata platform and therefore anything that runs on them.

Understanding the underpinnings and how they can be used to help system performance can be extremely useful when taking a system that runs in a traditional Oracle environment and migrating it to Exadata. Since most people are running APEX in a traditional environment, the data structures and queries have been written based on that knowledge.

And so starts my education about the technology behind Exadata and how differently (if at all) things need to be done in terms of database and query structure.

Along the way we're looking at writing a few tools/products that are aimed squarely at the Exadata world, covering things from sizing to migration to performance and beyond.

I'll be sharing what I learn about Exadata with regards to its relation to APEX et al as I move forward. 

Exciting times!

Quite some time ago on our web site, I wrote a quick tip about How to Warn a User That the Form Has Changed. The gist of the post was this:
"It's a common problem. A user spends time entering data into a form and then, for some reason, clicks a button or tab that will navigate away from the form without saving his data. Wouldn't it be nice if there were a way to warn the user that the data hasn't been saved, and that they may lose their work? "
The original solution used JavaScript to do the following:

  • Create and set a flag at the page level
  • Register an onchange event against all of the fields in the APEX form that sets the flag any time a value is changed
  • Register with the page's onbeforeunload event to check whether the flag has been set, and pop up a warning if it has

The original script worked well for a standard form, but I had never tested it against a tabular form. Well, someone else had, and let me know that it really didn’t work.

As you can imagine, there are a lot of moving parts in a tabular form, especially now with APEX 4. The original script would have registered onchange events against every item on the form, whether it was user editable or not. That obviously wouldn’t work for a tabular form, as there are some JavaScript events that fire and change some of the hidden items.

After thinking about this for a bit, I finally realized that instead of registering onchange events against every item on the tabular form, only the items that are user editable should be registered. If you look at the editable items, the id of each will conform to the following format: f99_9999

So the key was to register only those items whose id conforms to that format. The JavaScript match function and a regular expression can be used for this purpose.
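To see which ids the pattern accepts, here's a quick stand-alone sketch (the ids below are made up for illustration):

```javascript
// Sample item ids you might find on a tabular form page (made up for illustration)
var ids = ['f01_0001', 'f02_0013', 'P1_ENAME', 'fcs_0001', 'f03_123'];

// Keep only the ids matching the editable-item format f99_9999
var editable = ids.filter(function (id) {
  return id.match('^f[0-9]{2}_[0-9]{4}$') !== null;
});
// editable is ['f01_0001', 'f02_0013']
```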

The following JavaScript is the key. You can include it either in the HTML HEADER of the page, or in an HTML Region on the page:

<script type="text/javascript">

function onChangeinit() {

// This function sets up those fields which should
// trigger the "Are You Sure" Popup box upon navigating
// away from the page.

// First set up the array of elements in the form
// This method uses JavaScript to create an array of elements.
 var fields = document.getElementById('wwvFlowForm').elements;

// Now loop through the array and see if the id matches the format f99_9999
// If it does, assign the onchange event.
// The onchange function sets the value of a JavaScript variable
// to '1' to show that something has changed.

 for (var i = 0; i < fields.length; i++) {
   if (fields[i].id.match('^f[0-9]{2}_[0-9]{4}$')) {
     $x(fields[i]).onchange = function () { window.unsaved = 1; };
   }
 }

// Now set up the UNSAVED variable
// and the function that checks it on UNLOAD.
 window.unsaved = '';
 window.onbeforeunload = function () {
   return window.unsaved ? 'There may be unsaved changes to your data.' : undefined;
 };
}

// And you're done.
</script>

The onChangeinit JavaScript function needs to be run whenever the page loads. To do that, we call it by placing the following JavaScript in the Execute when Page Loads attribute at the page level:

onChangeinit();

Finally, we want to provide a way to short-circuit this for instances where we want the user to be able to press a button without getting the message. The most common example of this would be the SAVE button. Again, you can include it either in the HTML Header of the page, or in an HTML Region on the page.

<script type="text/javascript">

function preSubmit() {

// Call this before any action where you want the user
// to be able to navigate without the warning message.
// Clearing the onbeforeunload handler disables the warning.

 window.onbeforeunload = null;
}
</script>

The last thing to do is edit any buttons that we want to be able to submit the page without being warned and make sure they call the preSubmit JavaScript function.

To make this change to SUBMIT buttons, edit the button and do the following:

  • In the Action When Button Pressed region, change the Action to Redirect to URL
  • In the URL Target enter the following: javascript:preSubmit();apex.submit('SUBMIT');

Make sure that ‘SUBMIT’ in the code above is actually the name of the button that you’re editing. That way, when the apex.submit JavaScript function is called, it will be as if the user pressed the button.

For the DELETE buttons, edit the button and do the following:

  • In the Action When Button Pressed region, change the Action to Redirect to URL
  • In the URL Target enter the following: javascript:preSubmit();apex.confirm(htmldb_delete_message,'MULTI_ROW_DELETE');

Again, make sure that ‘MULTI_ROW_DELETE’ in the code above is actually the name of the button that you’re editing.

Once all of these things are put together, this should work nicely with tabular forms.

Hope this helps some of you out, and as always, comments are encouraged.

At Oracle Open World 2012, Scott Spendolini and I did a presentation entitled “Developing Commercial APEX Applications”. One of the topics that seemed to get a lot of attention and a number of questions was automating our build process using ANT. Several people requested a copy of the ANT script, so instead of sending it out individually, I’ve decided to include it here and walk through it.

First of all I must confess that I am, by no stretch of the imagination, an expert in building ANT scripts, so there are likely better ways to do some of the things that I’ve done. If you find that to be the case and you care to share your knowledge and experience, I invite you to share that knowledge in the comments section.

Second, and this may go without saying, but you’ll need to have the ANT executables installed on the server you’re going to use to build your projects.

Here is a link to download the entire file, but let’s go through it piece by piece.

<?xml version="1.0"?>
<project name="SERT BUILD" basedir=".">

The above preamble sets the XML version you’re adhering to, gives the whole project a name, and sets the base directory for any work that is to be done.

<!--The following TASKDEF is needed to include the ANT-CONTRIB libraries -->
<taskdef resource="net/sf/antcontrib/antlib.xml"/>

The ANT-CONTRIB library is an extension to the basic ANT commands. It gives you the ability to do things like FOR and FOREACH loops, IF statements, TRYCATCH, and a few others. I use it to loop through lists of files and perform actions on them. Not absolutely necessary, depending on what you’re doing, but very useful.

You can find more info about the ANT-CONTRIB library and download it here.

<!-- Define a macro that will be used to wrap the PL/SQL files -->
<macrodef name="wrapfile">
 <attribute name="file" />
 <sequential>
  <!-- The first step wraps the file with an output name of *.*.tmp -->
  <exec executable="wrap">
   <arg value="iname=@{file}" />
   <arg value="oname=@{file}.tmp" />
  </exec>
  <!-- The second step moves the *.*.tmp file back to the original file name-->
  <exec executable="mv">
   <arg value="@{file}.tmp" />
   <arg value="@{file}" />
  </exec>
 </sequential>
</macrodef>

Above is a macro definition that I wrote to wrap PL/SQL or SQL scripts. When you define a macro, you can then call that definition later in the script to perform repetitive actions. This is much like defining and calling a PL/SQL Function for repetitive tasks.
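Once defined, the macro can be invoked just like any other ANT task. For example, to wrap a single file directly (the path here is hypothetical):

```xml
<wrapfile file="${wd}/TRUNK/pkg/my_package.pkb" />
```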

For this to work you actually have to have the Oracle directory where the WRAP command resides in the path of the user that will be running the build. As a word of warning, I found out that the version of WRAP that comes with the Oracle Client for some reason didn’t work. I had to have a full install of the database software (even if I didn’t have a database created) for the build to work properly. I didn’t spend a lot of time tracking this down, but I suspect that it had to do with some library that was missing or inaccessible due to some path being out of whack.

<!-- This target sets up the base properties-->
<!-- These items can be changed by using the -D option on the command line -->
<target name="properties">
 <property name="wd" value="./work" />
 <property name="sv_version" value="000000"/>
 <property name="sv_parse_as" value="SV_SERT_APEX"/>
</target>

This is the first of the “Targets” in the file. A Target is basically something that will (either conditionally or unconditionally) get executed as part of the build. The properties target defines the variables that are going to be used during the build process. In my case I have the following:
  • wd - The working directory where the build will take place.
  • sv_version - The version of the software we’re going to build. It defaults to 000000, and will most likely be passed in on the command line.
  • sv_parse_as - The parse as schema that will be used to replace placeholders in the scripts. Again, can be overridden on the command line.
You can replace any of these values on the command line by using the -D option. We’ll talk more about how to call it from the command line later.

<!-- This target deletes the working directory and then recreates it.-->
<target name="setup" depends="properties">
 <echo message="Deleting the old working directory..." />
 <delete dir="${wd}" />
 <echo message="Creating the new working directory ..." />
 <mkdir dir="${wd}" />
</target>

The setup target sets up the working directory by cleaning it up (deleting it) if it already exists, then creating it so that we can actually do some work in it. You’ll notice the target tag has a depends attribute that tells ant that the properties target should be executed before this target runs. The depends clause is a nice safety net to make sure that all dependencies have been met before executing the current target.

<!-- This target does the SVN Check out -->
<target name="co" description="Checks out the most recent source to the working directory" depends="setup">
 <echo message="Checking out the sumnevaSERT Repository to the local working directory" />
 <exec executable="svn" dir="${wd}">
  <arg line="co --username myUsername --password myPassword"/>
 </exec>
</target>

The co target is used to check all of the current source files out of the source code repository. In our case we’re using SVN, so the command-line version of the SVN client needs to be installed on the build machine in order for this to work. You’ll see we’re setting up the executable (svn) and telling ant the working directory. The arg line provides the command-line arguments to the svn command. Obviously the repository URL and the username and password are fictional. You’ll need to fill those in for yourself.

One thing to note. I started off using my personal credentials for the SVN repository in the script. But because the script ended up being on several machines, we decided to create a “build” user that had read only access to the repositories to use in the build scripts.

<!-- This target does the replacement of the @VERSION@ variable in all the files -->
<target name="replace_all" description="replaces all instances of @SV_VERSION@ with the current build version" depends="setup">
 <replace dir="${wd}" token="@SV_VERSION@" value="${sv_version}" summary="yes" />
 <replace dir="${wd}" token="@SV_PARSE_AS@" value="${sv_parse_as}" summary="yes" />
</target>
<!-- This target does the replacement of the @VERSION@ variable in all the files EXCEPT the APEX application-->
<!-- It’s used to create an install file that maintains the @SV_VERSION@ Tags in the app, so we can move development to a new server -->
<target name="replace_clone" description="replaces specific instances of @SV_VERSION@ with the current build version" depends="setup">
 <replace dir="${wd}" token="@SV_VERSION@" value="${sv_version}" excludes="${wd}/app/*" summary="yes" />
</target>

The previous two build targets (replace_all and replace_clone) walk through the code tree and do replaces in all the source files, searching for the token listed and replacing each occurrence with the value provided. The first target replaces all occurrences, wherever they’re found, of both the @SV_VERSION@ and @SV_PARSE_AS@ variables. This would be used in a production build. The second target is used to create a clone of the code that would be in the development environment, just in case we had to move it to a new server, which we had to do a couple of times due to moving platforms, DB versions, and APEX versions.

<!-- This target wraps all of the appropriate PL/SQL FILES-->
<target name="wrap" description="Wraps the appropriate PL/SQL files" >
 <echo message="Wrapping all of the PKB files found in the working directory" />
 <for param="file">
  <path><fileset dir="${wd}/TRUNK" includes="**/*.pkb" /></path>
  <sequential>
   <wrapfile file="@{file}" />
  </sequential>
 </for>
 <echo message="Wrapping PL/PDF Certification Key" />
 <for param="file">
  <path><fileset dir="${wd}/TRUNK" includes="**/plpdf_cert_b.sql" /></path>
  <sequential>
   <wrapfile file="@{file}" />
  </sequential>
 </for>
 <echo message="Wrapping any SPECIFIC PKS files." />
 <for param="file">
  <path><fileset dir="${wd}/TRUNK" includes="**/sv_license_core.pks" /></path>
  <sequential>
   <wrapfile file="@{file}" />
  </sequential>
 </for>
</target>

The wrap target does what you’d expect. It uses the aforementioned wrapfile macro to apply the Oracle Wrap functionality to obfuscate all of the package bodies (*.pkb) in our code set. It also wraps a few sensitive files that we want to keep private. You’ll notice the use of the for and sequential commands. These are available because of the ANT-CONTRIB library we included earlier.

<!-- This target ZIPS the files-->
<target name="zip" description="ZIPS the files for the user" >
 <echo message="ZIPPING the final release..." />
 <zip destfile="release_${sv_version}.zip">
  <zipfileset dir="${wd}/TRUNK/app" prefix="sert/app" />
  <zipfileset dir="${wd}/TRUNK/cfg" prefix="sert/cfg" />
  <zipfileset dir="${wd}/TRUNK/ctx" prefix="sert/ctx" />
  <zipfileset dir="${wd}/TRUNK/ins" prefix="sert/ins" />
  <zipfileset dir="${wd}/TRUNK/license" prefix="sert/license" />
  <zipfileset dir="${wd}/TRUNK/logger" prefix="sert/logger" />
  <zipfileset dir="${wd}/TRUNK/pkg" prefix="sert/pkg" />
  <zipfileset dir="${wd}/TRUNK/syn" prefix="sert/syn" />
  <zipfileset dir="${wd}/TRUNK/tbl" prefix="sert/tbl" />
  <zipfileset dir="${wd}/TRUNK/vw" prefix="sert/vw" />
  <zipfileset dir="${wd}/TRUNK/doc" includes="*.pdf" prefix="sert/doc" />
  <zipfileset dir="${wd}/TRUNK/plpdf" prefix="sert/plpdf" />
  <zipfileset dir="${wd}/TRUNK" includes="ins.sql" prefix="sert" />
  <zipfileset dir="${wd}/TRUNK" includes="ins_beta.sql" prefix="sert" />
  <zipfileset dir="${wd}/TRUNK" includes="ins_admin.sql" prefix="sert" />
  <zipfileset dir="${wd}/TRUNK" includes="unins.sql" prefix="sert" />
 </zip>
</target>

Once everything is wrapped and ready to roll, we need to zip it up in a nice little package. Here we’re using the zip task to tell ant which files we want to include in the zip. The destfile attribute indicates the name of the zip file to be created, and the zipfileset lines indicate which files to include in it.
Examine the following example:

<zipfileset dir="${wd}/TRUNK/cfg" prefix="sert/cfg" />

The dir property tells ant where to look for the files, and the prefix property says what the path should be within the zip file. So you effectively have full control of the directory structure that will be created when the file is unzipped; in this example, files found under ${wd}/TRUNK/cfg will end up under sert/cfg in the zip.

In our case we didn’t want to include everything that was in our source tree as there are some things that are really there just for us. By using multiple zipfileset directives, we selectively choose which files to include.

<!-- This target deletes the working directory-->
<target name="teardown" depends="properties">
 <delete dir="${wd}" />
</target>

The teardown target simply deletes the working directory after we’re done with it. This is the last of the real grunt work.

Everything from here out is merely a set of “empty” targets that do no real work of their own. They simply reference the things that need to happen in their depends clause and echo out some information for the user.

<!-- This target BUILDS the full release set-->
<target name="release" description="Full Release Build" depends="properties,co,replace_all,wrap,zip,teardown">
 <echo message="Full Build Complete..." />
</target>

The first empty target is the full production build and is called release. The depends clause references all the things that should be executed and in which order. In the case of the production build, we see that it will perform the following tasks:
  1. PROPERTIES
  2. CO
  3. REPLACE_ALL
  4. WRAP
  5. ZIP
  6. TEARDOWN
Since the co target depends on the setup target, we don’t have to call it specifically in the list.

<!-- This target BUILDS the full release set without wrapping-->
<target name="nowrap" description="Full Release Build - No Wrapping of PL/SQL" depends="properties,co,replace_all,zip,teardown">
 <echo message="Unwrapped Build Complete..." />
</target>

The second empty target (nowrap) creates a zip file that is basically the same as the previous release target, but notice that it does not wrap the PL/SQL code. We use this target to create test builds so we can debug any issues we may find while testing.

<!-- This target BUILDS the full release set--> 
<target name="clone" description="Full Release Build - No Wrapping of PL/SQL" depends="properties,co,replace_clone,zip,teardown">
 <echo message="CLONE Build Complete..." />
</target>

The last empty target (clone) creates the clone that I spoke of above. It only processes a small subsection of the replacement variables and leaves the remainder intact. Again, this is for cloning a development environment. You may not need this.

Once your script is done, running it is a simple command-line call:

ant <target> -Dvariable_name=value

For instance, to do a full release build based on the targets in my build.xml file:

ant release -Dsv_version=020100

While we don’t have automated builds using something like Maven or Hudson, it would be entirely possible. I hope this has been of some help to those of you who were interested.