Royal Society of NSW News & Events


1227th Ordinary General Meeting

"A drop of Optics"

Dr Steve Lee and Dr Tri Phan, joint winners of the 2014 ANSTO Eureka Prize for Innovative Use of Technology

Date: Wednesday 5 November 2014

The talk at the 1227th ordinary general meeting was presented by Dr Steve Lee and Dr Tri Phan, joint winners of the 2014 ANSTO Eureka Prize for Innovative Use of Technology. They received the award for developing a very inexpensive polymer lens with extraordinarily high resolution that can be used on cameras such as those found on mobile telephones.

In recent years, miniaturisation has revolutionised sensors: small image sensors mean that the optical device can also be miniaturised, and it is much easier to achieve good optical quality in a small lens than in a large one. The early miniaturised lenses were ground from small pieces of glass and were quite expensive to manufacture. However, with the development of polymers with good optical qualities, high-quality lenses can now be moulded rather than ground. Better still, if surface tension rather than a mould is allowed to create the lens surface, the surface roughness that is almost impossible to avoid in any moulding process can be largely eliminated.

A familiar example is the optical quality of raindrops. If the liquid used to form the lens is of much greater viscosity than water (for example, a viscoelastic polymer), gravity can be used to shape the surface of the lens to give specific optical properties. The technique that Dr Lee and Dr Phan developed uses highly viscous polydimethylsiloxane (a silicone polymer) as the lens material, suspending droplets from a small orifice so that gravity forms droplets with a curvature that has the right optical characteristics. The silicone polymer can then be cross-linked, stabilising its shape, simply by putting it into an oven to cure. Different lens geometries can be obtained by applying several layers of polymer with intermediate curing steps.

One application that Dr Lee and Dr Phan have developed is to clip these lenses onto the cameras of mobile telephones. A standard moulded lens on a mobile telephone has a surface roughness of about 200 nm, whereas the elastomer lens is around 10 nm. Consequently, much finer detail can be resolved using the polymer lens. The opportunity is to integrate lenses such as these into smartphones and use them for diagnostic and remote-sensing applications.

The technology was just released at the Google "The Mobile First World" conference in Taiwan.


1226th Ordinary General Meeting

"Australia's most spectacular environmental rehabilitation project: Phillip Island, Pacific Ocean"

Dr Peter Coyne

Date: Wednesday, 1 October 2014

Perched atop a submerged seamount, in turn atop a submarine ridge, Phillip Island and its close neighbour Norfolk Island are tiny specks, the only land in a vast expanse (2.5 million square kilometres) of the southwest Pacific Ocean. Both islands were created by volcanic activity between 2.8 and 2.2 million years ago. The plateau top of the seamount, 100 x 35 kilometres, is between 30 and 75 metres below present sea level. Sequential ice ages during the last 2 million years exposed the entire plateau, an area about 100 times the size of the present islands. Such an area could have accommodated about four times as many species as the present islands. During the last ice age the entire plateau was exposed for 24,000 years until 13,000 years ago. Sea level 25 metres higher, reached 10,000 years ago, still exposed an island about 35 km long, large enough to accommodate more than double the species count of the present islands and joining these islands with dry land. An island at least this large was exposed for 60,000 years during the last ice age, before the sea reached its present level just 6,000 years ago. The generally much larger size of the islands and the ecological stress caused by their declining area, and the consequent loss of three-quarters of their species, between 13,000 and 6,000 years ago, could help explain the great biological value of the islands and Phillip Island specifically.
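The species estimates quoted above are consistent with the classical species-area relationship, S = cA^z. With the commonly used empirical exponent z ≈ 0.3 (an assumption for this sketch, not a figure from the talk), a plateau 100 times the area of the present islands supports roughly four times as many species:

```python
# Illustrative check of the species-area relationship S = c * A**z.
# The exponent z ~ 0.3 is a typical empirical value, assumed here.
def species_ratio(area_ratio, z=0.3):
    """Ratio of species counts between two islands whose areas differ
    by area_ratio, under S = c * A**z (the constant c cancels out)."""
    return area_ratio ** z

# An exposed plateau ~100x the area of the present islands:
print(round(species_ratio(100), 1))   # ~4x the species count
```

The same relationship explains why shrinking the exposed land back to the present islands would strand far more species than the smaller area can support, driving the extinctions described.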

Phillip Island was densely vegetated when pigs were released there in 1793, followed by goats and rabbits by 1830. The feral grazers quickly destroyed the vegetation and by 1860 the island was mostly bare. Photographs dated 1906, when only rabbits remained, show landscapes almost identical to those of 1980 — almost no vegetation was present. In 1979 Dr Coyne began a three-year experimental program to investigate the effects of the rabbits and the potential for vegetation re-establishment. The work was physically difficult and often hazardous. The first year's results were enough to persuade decision-makers that the rabbits should be eradicated. That work began in 1981 and by 1986 the rabbits had been destroyed by a combination of an artificial strain of myxoma virus, poisoning, shooting, trapping and fumigating. The eradication program required swimming to habitat accessible only from the sea, archery to distribute the myxoma vector (rabbit fleas) to other inaccessible areas of habitat, and a lot of rock climbing on cliffs up to 250 metres high. Since then the island has been transformed by new vegetation, most arising spontaneously. Some of the world's rarest plant species have been discovered, rediscovered or have increased in numbers. One has only a single genotype, two have fewer than fifty individuals and another has fewer than 250 individuals. A genus and species endemic to Phillip Island sadly was not rediscovered, and at least two Phillip Island plants are extinct. Fauna have also benefitted from the revegetation and, being free of rats and cats, the island has potential as a refuge for threatened fauna endemic to Norfolk Island.


1225th Ordinary General Meeting

"The Fourth Dimension and Beyond - the paradox of working in unimaginable worlds"

Emeritus Scientia Professor Ian Sloan AO FRSN

Date: Wednesday 3 September 2014

Professor Ian Sloan is not content to work in an environment of four dimensions: he is quite at home in spaces with many more dimensions than most of us are accustomed to. Many mathematical problems can be considered as problems in multidimensional space; the question is how we imagine these environments. The dimension of a space can be considered to be the number of directions in which you can go from any single point within it. For example, in our four-dimensional world, from any point we can go in three spatial directions, plus time. If we are in a six-dimensional environment, we can go in six directions from any given point, and mathematically we need no more than six variables to describe the environment. But why would we be interested in multidimensional spaces?

Many problems are best analysed in many dimensions. For example, a shop may have 250 stock items; its stock levels can be thought of as a single point in 250-dimensional space. Each stock item also has a price, so there are another 250 dimensions to consider. One area where this approach has a major application is in evaluating certain types of financial transactions, such as derivatives.

An investor may want to analyse a potential investment in Woolworths shares, for example. The payoff might be thought of in terms of the closing share price over a period of 250 trading days in, say, $10 increments. Using multidimensional mathematics, the investor can calculate the expected payoff at a certain trading day on the basis of what the closing share price might be. Such calculations soon become extremely complex, in fact too complex to be evaluated directly (this is known as "the curse of dimensionality", a term coined by Richard Bellman, a noted researcher in this field). And what if the payoff can take any value, rather than being in $10 increments? Does this make it even harder? Fortunately, it does not.

Using a statistical approach known as the Monte Carlo method, these highly complex functions can be evaluated quite accurately. The Monte Carlo method may be thought of as a technique whereby one randomly throws points at a target and evaluates whether the points fall on the target or outside it. If the target can be mathematically described, the functions can be evaluated with considerable accuracy after no more than a few thousand random "throws". This enables highly-complex derivative functions to be evaluated quite accurately. The problem then lies in the assumptions underlying the mathematical model used to define the "target". Flaws in the underlying assumptions have resulted in many a lost fortune!
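The "throwing points at a target" picture can be made concrete with a minimal sketch. Here the target is a quarter circle inside the unit square (a stand-in chosen for its known answer, pi/4, rather than an example from the lecture):

```python
import random

def monte_carlo_area(n_throws, seed=0):
    """Estimate the area of a quarter circle of radius 1 (exactly pi/4)
    by throwing random points at the unit square and counting hits."""
    rng = random.Random(seed)  # seeded for reproducibility
    hits = 0
    for _ in range(n_throws):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:   # the point falls on the target
            hits += 1
    return hits / n_throws

# A few thousand throws already land close to pi/4 ~ 0.785.
print(monte_carlo_area(10_000))
```

The same hit-counting idea carries over to derivative valuation: the "target" becomes the region of share-price paths where the payoff is positive, and the accuracy of the answer depends entirely on how well that region is modelled.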

Despite the accuracy of the Monte Carlo method, with highly complex functions the number of random throws required can become very large, and the question arises: can we do better than generating the throws randomly? For some problems, using a structured pattern of points, for example a lattice, has been shown to converge much more quickly.
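To see the idea, the sketch below compares plain Monte Carlo sampling with a rank-1 lattice rule on a smooth two-dimensional integral with a known answer. The Fibonacci generating vector (1, 34) with n = 55 points is a standard textbook choice for 2D lattice rules; neither the integrand nor the parameters come from the lecture:

```python
import math, random

def f(x, y):
    # Smooth test integrand; its exact integral over [0,1]^2 is (e-1)^2.
    return math.exp(x + y)

def mc_estimate(n, seed=0):
    """Plain Monte Carlo: average f over n random points."""
    rng = random.Random(seed)
    return sum(f(rng.random(), rng.random()) for _ in range(n)) / n

def lattice_estimate(n=55, g=(1, 34)):
    """Rank-1 lattice rule: average f over the points {k*g/n mod 1}.
    n=55 with g=(1,34) is the classical 2D Fibonacci lattice."""
    return sum(f((k * g[0] / n) % 1.0, (k * g[1] / n) % 1.0)
               for k in range(n)) / n

exact = (math.e - 1) ** 2
print(abs(mc_estimate(55) - exact))      # absolute error, random points
print(abs(lattice_estimate(55) - exact)) # absolute error, lattice points
```

For the same 55 function evaluations, the evenly spread lattice points typically track the true value more tightly than independent random throws, which is the effect Professor Sloan's work exploits.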

Multidimensional mathematics is one of our most powerful tools in solving problems from financial derivatives, to metadata analysis, to cosmology. Professor Sloan provided a particularly clear insight into a highly complex and very powerful mathematical technique.


1224th Ordinary General Meeting

Wednesday, 6 August 2014

"Saving Australia through science education"

Emeritus Scientia Professor Eugenie Lumbers AM DistFRSN

The world is experiencing an exponential rate of technological progress. Change was relatively gradual from the time of the domestication of the horse until the 17th century. Indeed, in the early stages of the Industrial Revolution, industry was still heavily dependent on horse-drawn transport. In 1900, just 14 years after the invention of the motor car, there were still 300,000 horses in service in London. That same year, there were 0.11 cars per thousand people in the US; by 2009 there were 828. This enormous, rapidly accelerating technological change took place as a consequence of science and its application in the development of technology. The question is why there was such enthusiasm for science in the 1940s and 1950s, and why this has largely disappeared today in many countries, not the least of which is Australia. This poses a major challenge for Australia: how will we keep up with technological progress when few people are interested in seeking a scientific or technological education? Despite the apparent interest in science in a multitude of TV programmes, for example, such programmes position science as entertainment, not as true science.

Despite this shift away from science, Australia was still doing well by international standards until the late 20th century. In 2000, Australia ranked third in the world (after South Korea and Japan) in the OECD Programme for International Student Assessment (PISA), a test that measures problem-solving capability in 15-year-olds. In the latest test, in 2012, Australia ranked eighth (after Singapore, South Korea, Japan, China, Hong Kong, Taipei and Canada). It is not surprising that Australia's ranking is slipping when only 51% of students take a science subject in year 12 and fewer than 20% study chemistry or physics. (Interestingly, biology is somewhat higher at 25% because it is seen as being "less academic".) What will the future hold when the technologically educated people of today are gone? It is extraordinary that 76% of Australians see science as important to Australia's future but not as directly relevant to themselves.

The Australian Academy of Science has tried to address this through its inquiry-based "Primary Connections" programme, which helps teachers develop their teaching programmes and provides curriculum resources. Similarly, the Academy's "Science by Doing" programme for secondary schools aims to stimulate the all-important interest and enjoyment in science for children in early secondary school so that they go on to choose a career in science.


New Fellows appointed 2014

On Wednesday 2 July 2014, a group of eleven Fellows were presented with their certificates. Thirty-six Fellows have been appointed since the new Rules were introduced in December 2013.

Left to right: Donald Hector (President), Bill Hogarth, Heather Goodall, Ron Johnston, Roy MacLeod, Robert Marks, John Simons, Judy Raper, Brynn Hibbert, the Hon Peter Baume AC, Thomas Maschmeyer, Des Griffin AM.

1222nd Ordinary General Meeting

Wednesday, 2 July 2014

"What causes MS? The impact of the genetic revolution"

Professor Graeme Stewart AM

Professor Graeme Stewart AM, director of clinical immunology at Westmead Hospital, has researched the genetic influences on disease, in particular on multiple sclerosis (MS). MS is the commonest chronic neurological disorder of young adults. It usually starts with a relapsing/remitting phase (symptoms occur and then go into remission, sometimes for extended periods), usually with onset at about the age of 30. The disease can be relatively benign with periods of disability; it can present as a relapsing/remitting disease with a gradual increase in disability; or, in about 10-20% of patients, it can present as "primary progressive", where disability progressively increases over time. MS is caused by the body's immune system malfunctioning: macrophages devour the myelin sheath around nerve cells, exposing the nerve axon and thereby disrupting the flow of information along the nerve cell. The body is able to repair the damage by re-myelinating the nerve cells after this initial attack; however, if the myelin is attacked a second time in the same place, the body is unable to repair the sheath and a relapse occurs. Hence the symptoms of the disease progress.

The important question is: what causes this? MS is a disease that is clearly influenced by both genes and environment. Studies of the disease in identical twins show 30% concordance, whereas in fraternal twins there is 4% concordance; the background incidence of MS is 0.4% of the population. This suggests that genetic influences are very significant but that environmental factors are also a consideration. The most interesting environmental effect is that the incidence of MS is quite highly correlated with latitude: in Australia, for example, there is a 4 to 7 times hazard ratio between North Queensland and Tasmania. In the southern hemisphere, the further south you live, the more likely you are to contract MS. The most likely reason for this is the reduced exposure to UV-B light the further you are from the equator, and the resulting vitamin D deficiency. Other environmental factors include smoking and exposure to the Epstein-Barr virus that causes glandular fever (almost all MS patients have been infected with Epstein-Barr virus). The fact that Epstein-Barr virus is implicated in virtually all MS cases may present an opportunity for treatment if the effect of this virus on DNA is understood.

Since the early 1970s, there has been a search for the genes implicated in MS. The first was found in 1972 but it was not until 2007 that the second gene was identified. Since then, as a consequence of the human genome project and widespread sequencing technology, together with recent advances in computer power and statistical algorithms to handle large amounts of data, over 100 genes have been identified. Pursuing genetic associations is expected to give insight into the pathogenesis, in particular the interaction between genes and environment. It is hoped that this will lead to interventions to prevent the disease from progressing. In addition, identifying genetic biomarkers may provide major opportunities for new treatments, including personalised treatments based on the individual's genetic profile.

There has been substantial progress in treatments for MS, including trials of drugs to stop T cells crossing the blood-brain barrier, drugs that capture lymphocytes and hold them in the lymph nodes and early indications that drugs targeting specific proteins identified through genetic analysis might be useful. In addition, trials are underway to see whether large doses of vitamin D might have some impact and whether increased exposure to ultraviolet light might also offer some improvement.


Annual awards evening and dinner 2014

On Wednesday 7 May, the annual awards evening and dinner were held at the Union University and Schools Club in Sydney. The dinner was extremely well attended, and the address by Professor Barry Jones AC FAA FACE FAHA FASSA FTSE DistFRSN on the attack on the scientific method stimulated a lot of discussion. During the evening, the Society's 2013 awards were presented and the inaugural group of eleven Fellows were presented with their certificates.

Back row: Benjamin Eggleton, Jerome Vanclay, Richard Banati, Ian Dawes, John Gascoigne. Front row: Aibing Yu, Ian Sloan, Judith Wheeldon, Donald Hector (President), Heinrich Hora, Merlin Crossley, Trevor Hambley

The President, Dr Donald Hector, presented the Society's 2013 awards. The Edgeworth David Medal was presented to Assoc Prof David Wilson for his outstanding work on modelling HIV/AIDS and using this information to develop treatment and prevention strategies. Prof Michelle Simmons DistFRSN was awarded the Walter Burfitt Medal and Prize, and Prof Brien Holden was awarded the James Cook Medal for his work in treating myopia (a leading cause of preventable blindness), particularly in developing countries. The Clarke Medal could not be presented to the distinguished geologist William Griffin, as he was overseas and unable to attend.

Left to right: Assoc Prof David Wilson, President Dr Donald Hector, Prof Brien Holden and Prof Michelle Simmons DistFRSN.

Distinguished Fellow's Lecture 2014

The Society was proud to have Professor Barry Jones AC DistFRSN present the second annual Distinguished Fellow's Lecture at the Society's annual dinner on Wednesday 7 May 2014. Professor Jones is the only person to be a Fellow of all four of Australia's learned Academies.

Prof Barry Jones AC DistFRSN delivers the second Royal Society of NSW Distinguished Fellow's Lecture.

1221st Ordinary General Meeting

Wednesday, 7 May 2014

"What lessons have we learnt from the Global Financial Crisis?"

Professor Robert Marks

In 2008, the world suffered "the equivalent of cardiac arrest", according to the Financial Times. It became virtually impossible for any institution to finance itself (that is, borrow in the markets) for longer than overnight. With the collapse of Lehman Bros, interbank credit markets froze, and counterparty risk was considered too great for prospective lenders to take on transactions. The London interbank overnight lending rate, typically in the range of 0.2% to 0.8%, spiked to over 3%. This situation raises two questions: what caused this global financial crisis (GFC), and how can we attempt to avoid similar crises in the future? The origins of the crisis go back more than 30 years.

Starting in 1977, there were substantial changes made to US investment legislation. Early in this period, the aim was to make finance more readily available to low-income borrowers, to progressively eliminate controls on mortgage rates, and to remove discrimination in the US housing market. In 1999 and 2000, there was substantial deregulation, with major changes to long-standing legislation, in particular the repeal of the Glass-Steagall Act of 1933, which had imposed restrictions on banks during the Great Depression. There were also reforms to the Federal housing finance regulatory agencies, loosening their lending requirements.

This period of financial deregulation encouraged consolidation and demutualisation of many financial institutions that had been mutually or privately owned, with these being floated as public companies. Whereas previously their lending practices had been conservative as they had been risking their own money, now the money at risk belonged to other people! There was also great creativity in developing new financial products and instruments: Mortgage-Backed Securities (MBS), structured investment vehicles, Credit Default Swaps (CDS) and Collateral Mortgage Obligations (CMO).

In the early 2000s, the September 11 attacks, coming not long after the bursting of the "tech bubble", led to a prolonged period of low interest rates. US fiscal policy was heavily in deficit, leading to massive issuance of US bonds that were largely bought by China and other Asian countries. At the same time there was further financial deregulation, relaxing capital requirements and thereby encouraging higher gearing in financial institutions.

Unsurprisingly, firms responded to the incentives put before them. The market for the new financial instruments boomed, and the rating agencies responded by changing the way in which they charged for their services: they began charging the firms whose products they were rating, rather than the potential buyers of the products. In the US, the financial sector grew from 3.5% of GDP in 1960 to nearly 8% of GDP in 2008.

Drawing these strands together, there were four causes of the GFC: the repeal of the Glass-Steagall Act; the decision by Congress not to regulate derivatives; the relaxation of regulations that allowed banks to expand their gearing; and the change by the ratings agencies to charge the issuer rather than the buyer of rated products.

How likely is this type of situation to occur again in the near future?

Unfortunately, a number of European countries may be facing similar challenges unless they take steps to avoid the problems that the US experienced. Fortunately, Australia avoided the worst of the GFC, well-served by the "four pillars" banking policy. However, there needs to be recognition that information is asymmetric and that the issue is really not one of risk but rather of uncertainty, where there are no simple answers. As George Santayana observed in 1905, "those who cannot remember the past are condemned to repeat it".


Governor invests new Distinguished Fellow

On Wednesday 16 April, Prof Peter Doherty AC Dist FRSN was formally invested by our patron, the Governor, Prof Marie Bashir AC CVO at a ceremony at Government House. Professor Doherty was appointed a Fellow in September 2013 and a Distinguished Fellow in December 2013.
President Dr Donald Hector (left), Prof Peter Doherty after receiving his Distinguished Fellowship, the Governor and Vice President Em Prof Brynn Hibbert (right) at Government House, Sydney.

1220th Ordinary General Meeting

Wednesday, 2 April 2014

"The Jameson cell"

Laureate Professor Graeme Jameson AO

At the 1220th ordinary general meeting of the Society, Laureate Professor Graeme Jameson described the development of the Jameson cell, one of the most important technological contributions to the Australian economy in the last 50 years.

The Jameson cell is a flotation cell used to concentrate the valuable components of ore in minerals processing. In a typical mining operation, the first two stages of extracting minerals are the mine itself, from which the ore is recovered, and the concentrator, where the valuable mineral is extracted from the rest. Generally, the valuable components are no more than 2% of the ore recovered, so there is a massive challenge in isolating this from spoil for further processing. An important technology developed to achieve this concentration step was the flotation cell, a process first developed early in the 20th century.

In flotation, the ore is ground into very fine particles and dispersed with water and surfactants in a large mixing vessel that can be kept agitated and into the bottom of which compressed air can be introduced. Reagents are added to render hydrophobic the valuable mineral particles exposed during the crushing. Air is bubbled through the suspension, and the hydrophobic mineral particles attach to the bubbles, float to the surface as a froth and are then skimmed off for further processing and enrichment. Because large volumes of ore have to be treated to recover a relatively small proportion of valuable product, this is a very expensive step in recovering minerals: first, the ore has to be ground to very fine particle sizes (typically around 150 micrometres), which takes a lot of energy; and second, the volume that has to be treated in preparing the slurry is large, so the processing equipment is big and expensive. Any technology that reduces either the cost of grinding or the size of the processing equipment can have a major impact on the cost of production. The Jameson cell revolutionised the flotation process by reducing the size of the equipment needed to efficiently float off the minerals.

Over a period of several years, Professor Jameson identified the optimum parameters for particle size and the corresponding optimum size for the air bubbles used to float the treated particles. Generally, the particle size needs to be less than 150 micrometres or, even better, less than 100 micrometres: the smaller the particle, the more likely it is to consist of the pure mineral. But the real technological breakthrough was identifying that the optimum bubble size is about 300 micrometres; until then, conventional cells operated using bubbles about three times that size, at about 1 mm diameter. Having identified the optimum bubble size, the challenge was then to design equipment that produced the right amount of shear to generate bubbles of 300 micrometres diameter. This turned out to be relatively simple, using high-pressure jets of water to entrain the air.

Much of the commercialisation work was done at Mount Isa in the 1980s and 1990s. Since then, the cell has been deployed around the world and is used routinely to extract coal, copper, lead, zinc and potash, as well as in other industries such as oil-sands extraction and industrial waste treatment. Over 300 cells have been installed, and the cumulative value created by this invention is more than $25 billion.

Professor Jameson was named NSW Scientist of the Year in 2013.


1219th Ordinary General Meeting

Wednesday, 5 March 2014

"Big data knowledge discovery: machine learning meets natural science"

Professor Hugh Durrant-Whyte FRS, CEO, National ICT Australia

Hugh Durrant-Whyte is an internationally recognised expert on the analysis of "big data": the mass of information that is being generated by current information and communication technologies. Much of this is "metadata", data that is captured as part of some activity (for example, the camera settings, capture date and so on recorded when a digital photograph is taken, or the data kept by telecommunication companies every time a mobile phone call is made).

Around 2.5 × 10^18 bytes of data are generated every day. There is immense value in mining these data, but this requires sophisticated analytical techniques. "Data analytics" is the term coined for technologies to analyse such data in areas as varied as the finance industry, the health industry, infrastructure planning, failure analysis in mechanical and electronic equipment, and environmental analysis, to name but a few examples. Data analytics uses Bayesian probability theory (named after Rev Thomas Bayes, an 18th-century mathematician) to build quantitative models of existing data, gather new data to address remaining questions, and then update the model to incorporate both the old and new data.
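That model-then-update cycle can be sketched with the simplest possible Bayesian example, a beta-binomial model of an unknown proportion (the numbers and the setting are illustrative only, not from the talk):

```python
def update_beta(alpha, beta, successes, failures):
    """Bayesian update of a Beta(alpha, beta) prior on a proportion,
    given new binomial data. The beta prior is conjugate to the
    binomial likelihood, so updating just adds the observed counts."""
    return alpha + successes, beta + failures

# Start with a uniform prior Beta(1, 1) on, say, a response rate.
a, b = 1, 1
# Model the existing data: 3 successes out of 10 trials.
a, b = update_beta(a, b, 3, 7)
# Gather new data, then update the model: 12 successes out of 40.
a, b = update_beta(a, b, 12, 28)
# The posterior mean is alpha / (alpha + beta).
print(a, b, a / (a + b))   # 16 36, posterior mean ~0.308
```

Each new batch of data refines the posterior, which then serves as the prior for the next batch: exactly the "update the model with old and new data" loop described above, just at a scale of two parameters rather than billions.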

Data analytics can be modelled using three types of mathematical functions: discrete functions that describe, for example, events or people's actions; finite probability functions, such as signals or locations; and infinite probability functions, such as spatial or temporal fields. As the mass of available data increases, the analysis can converge on knowledge. For example, the payment patterns exhibited by individuals can be aggregated to the behaviours of bank branch customers, giving an understanding of consumer behaviour. On the other side of the table, customers can use masses of data to take advantage of the best deals available or to customise internet-based content that they may wish to buy.

Where masses of historical data are available (for example, in managing water assets), readily available historical parameters can be analysed for such applications as predicting equipment failures. In the case of water asset management, pipe age, soil type and so on can be analysed to give a probabilistic prediction of when a water main might fail.
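As an illustration of that kind of probabilistic model, a logistic function can map pipe attributes to a failure probability. Every coefficient below is invented for the sketch; a real model would fit them to historical failure records:

```python
import math

def failure_probability(age_years, soil_corrosivity,
                        intercept=-6.0, w_age=0.06, w_soil=1.2):
    """Toy logistic model of P(water main fails within a year) as a
    function of pipe age and a 0-1 soil-corrosivity score. All the
    coefficients are hypothetical, chosen only to illustrate the
    shape of such a model, not drawn from any real dataset."""
    z = intercept + w_age * age_years + w_soil * soil_corrosivity
    return 1.0 / (1.0 + math.exp(-z))

# An old main in corrosive soil scores far riskier than a new one:
print(failure_probability(80, 0.9))   # higher-risk pipe
print(failure_probability(10, 0.1))   # lower-risk pipe
```

Ranking mains by such a probability is what lets a utility prioritise inspection and replacement before failures occur.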

The mining industry has invested large amounts of money in developing systems that use masses of existing information to automate mine operation. These systems take all the available data about the surface of the mine, the subsurface, mapping and drilling, and combine them into a single, real-time, fully integrated data model of the mine.

The purpose of National ICT Australia (NICTA) is to use these data-analytics approaches to produce leading-edge technologies and models for such varied applications as financial modelling and creating large-scale, fully integrated data maps of regions (perhaps even as large as continental Australia). There is also a particular focus on data-driven discovery in the natural sciences, in applications as varied as modelling ancient plate tectonics to predict mineralisation (on a timeframe of as much as 1.5 billion years) or ecological modelling, for example predicting the growth of trees. Ultimately, these may be able to be integrated into one massive model of the Australian continent.


Four Societies Lecture 2014

Thursday, 27 February 2014

"Questions of power in NSW"

Professor Mary O'Kane, NSW Chief Scientist and Engineer

At the annual Four Societies Lecture, Professor Mary O'Kane considered the major questions that face NSW in the future of energy production and utilisation. Asking the right questions is key – it reduces the time taken to identify the best solutions.

Australia is the ninth largest energy producer in the world and one of only four net energy exporters. We have 38% of the world's uranium, 9% of the world's coal and 2% of the world's gas. In terms of consumption, agriculture takes 3%, mining 13.5%, manufacturing and construction 25%, transportation 38% and residential use about 11%. The 2014 Commonwealth Energy White Paper is seeking to address a number of questions regarding Australia's energy future. These include security of energy sources, the role of government and regulatory implications, growth and investment, trade and international relations, productivity and efficiency, and alternative and emerging energy sources and technologies.

A recent report by the Grattan Institute identified a number of important issues. Australia has a lot of gas and coal, yet has not fully considered the impact of having no clear climate-change policy. There is also the question of how the electrical system (particularly one based on large generation units interconnected by a grid) can meet the challenge of occasional very high peak demand. The Grattan Institute also posed questions about the balance between market and regulation, and the importance of getting this right, and explored the implications of new technologies and whether they provide potential solutions.

Australia is not unique in facing these challenges. One approach taken in the US has been to establish an energy agency using a model originally conceived for advanced research projects in the defence industry. ARPA-E, the Advanced Research Projects Agency-Energy, was established to fund high-risk/high-reward research to identify new energy technologies for the US. The research programmes in its portfolio relate to reconceiving the grid, the impact of microgrids, the impact of analysing big data, the gas revolution, new ways to achieve higher efficiencies, entirely new technologies, the best policy settings to encourage the adoption of new technologies, and innovative models for research and development. Perhaps these sorts of approaches need to be applied in NSW.

Questions that need to be addressed include: what role should nuclear energy play? To what extent is geothermal energy applicable? How should we gain new efficiencies? How can we better optimise grid storage and geometry? What are the downsides of these various technologies? Are there opportunities to export energy directly to our immediate neighbours (e.g. Indonesia)? How effective is Australia's energy R&D?

Professor O'Kane summarised the issues in three searching questions. First, how do we characterise the system that we want and the process to realise it? (What are the most important characteristics that our energy future must have, or would be nice to have? What energy futures do we definitely not want?) Second, who should be responsible for demonstrating new technologies (responsible for progress, experiment, scale-up, the economic model and "energy equity")? And third, how can we have the best system possible? We must become expert at asking the questions, seeking solutions from around the world and, importantly, developing solutions locally where appropriate in order to create a leadership position.


Joint meeting with the Australian Academy of Forensic Sciences

Thursday, 19 February 2014

"Searching for clues: Unmasking art fraud and fraudsters" - Associate Professor Robyn Sloggett

At the first joint meeting of the Society and the Australian Academy of Forensic Sciences, Professor Robyn Sloggett explained the approach taken by forensic scientists in investigating and prosecuting cultural heritage offences. The difficulty that faces authorities is determining whether or not cultural records are true and verifiable. Forensic examination used in these situations follows the Locard principle (named after Edmond Locard, the pioneering French forensic scientist) that "every contact leaves a trace".

In order to determine the provenance of works of art, the forensic scientist seeks to establish how the object was made, what it is made of, when it was made and where it was made. But it does not attempt to determine who made it. An important foundation of provenance is establishing a body of work that can be used as a reference. Here, art and science converge. For example, in order to determine whether or not a painting might have been painted by Rembrandt, it is known that Rembrandt lived from 1606 to 1669, that certain types of pigments but not others were available in that era, that canvas and other materials need to be consistent, and so on. Each of these considerations is necessary but none, on its own, is sufficient to establish authenticity. As the philosopher Karl Popper put it, testing cannot establish authenticity; rather, it can only falsify it.

When forensic science is used to determine whether fraud has taken place, a number of questions have to be answered: was there a financial benefit in question? Was there deliberate deception? Was a non-authentic work passed off as authentic? Professor Sloggett referred to a notable case in which the Australian painter and art copyist, William Blundell, painted works in the style of a number of well-known Australian artists. He referred to these works as "innuendos", stating that they were intended for decorative purposes only and were not passed off as originals, typically being sold for only a few hundred dollars; thus, there was no intention to defraud.

When authentication is required, the forensic approach is a combination of scientific analysis, gathering historical facts, attempting to verify the provenance, and weighing the evidence for and against the item being authentic; these are considered together in order to reach a determination, on the balance of probabilities, as to whether or not the item is indeed authentic. This depends on the availability of good databases and a logical development of the case, with corroborative evidence and expert knowledge and opinion. The forensic process can be considered in two parts: investigation of primary sources; and investigation of secondary sources. Primary-source techniques are either invasive or non-invasive, with invasive techniques including methods such as laboratory-based analysis of materials. Non-invasive methods such as spectroscopy, x-ray diffraction, electron microscopy, and Raman and Fourier-transform infrared analysis have become important tools in recent decades.

Typically, the first steps are to examine documents regarding the artefact and then to investigate materials, such as the frame, paints, brushstrokes and finishing techniques. Contaminants such as pollen, dirt and fingerprints can also be useful, as are ageing characteristics and the effects of the environment. Later changes can also be important. Secondary sources are then investigated and include observations such as style and technique, but these are more difficult to deal with as they are subjective and expert opinion is often divided.

Professor Sloggett gave some insight into a number of notorious fraud cases, one being that of Wolfgang Beltracchi, who forged over 200 works that were passed off as pre-World War II works by famous European artists such as Max Ernst, Heinrich Campendonk and Fernand Léger. The amount involved was $48 million over 15 years and resulted in a gaol term of six years, considered to be a rather light sentence. In Europe, art fraud cases are relatively common but in Australia they are quite rare. One reason for this is that there is no art fraud squad in Australia: criminal prosecutions are rare because there is no professional expertise in identifying and tracking down art fraud cases and taking them to prosecution.

There was an interesting discussion after the talk that continued over the first joint dinner enjoyed by both the Society and the Academy members.


2014 Sydney Lecture Series

Meetings are held at various venues in Sydney (be sure to check the website a few days before the event for final venue details). Unless indicated, booking is not necessary. All welcome. Meetings usually commence at 6:00 pm for 6:30 pm.

Entry is $5 for members of the Society and $20 for non-members to cover venue hire and a welcome drink. We often have dinner after the meeting (the cost is $75 per head). Pre-booking is appreciated.


1215th Ordinary General Meeting

Wednesday, 2 October 2013

"Astrobiology: the latest from 'Curiosity'" - Professor Malcolm Walter

"Seven minutes of terror" was how the operators at the Jet Propulsion Laboratory in the US described the landing of 'Curiosity', the latest rover mission, which landed on Mars in August last year. In the last stage of the landing, the descent vehicle hovered about 80 m above the surface of Mars and lowered Curiosity (which weighs nearly a tonne) by a "sky crane" to a gentle touch-down. Given that it can take up to 20 minutes for signals to reach Mars (up to a 40-minute round trip), there is a significant delay that constrains the Earth-based control station.
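The quoted delay is a direct consequence of the finite speed of light. As a rough check (the Earth-Mars separation varies between roughly 5.5 × 10⁷ km and 4.0 × 10⁸ km; the distance below is an illustrative value near the larger end, not a mission figure):

```python
# One-way light-travel time between Earth and Mars near maximum separation.
# The distance is an illustrative assumption, not a mission number.
SPEED_OF_LIGHT_KM_S = 299_792.458   # km/s, exact by definition
distance_km = 3.6e8                 # near-maximum Earth-Mars separation

one_way_s = distance_km / SPEED_OF_LIGHT_KM_S
print(f"one-way delay: {one_way_s / 60:.1f} min")      # ≈ 20 min
print(f"round trip:    {2 * one_way_s / 60:.1f} min")  # ≈ 40 min
```

At the closest approach the same calculation gives only about three minutes, which is why the delay is quoted as "up to" 20 minutes.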

The purpose of the Curiosity mission is to understand the geological and biological context to determine whether life may have existed or, indeed, still exists on Mars. Mars is somewhat smaller than the Earth, with the surface area of Mars being about the same as the exposed surface area of the Earth's continents. Until as recently as 60 years ago, it was thought that advanced life may once have existed on Mars and could have been responsible for the canals and other geological phenomena that had been observed through telescopes. It is now thought that the most advanced form of life possible on Mars would be single-cell organisms, probably similar to those that existed on Earth in the early stages of life. To put this in perspective, life first appeared on Earth about 3,500 million years ago and, until about 500 million years ago, consisted entirely of single-cell organisms. Nearly all of the diversity of life on Earth is microscopic, so it makes sense to look for this as the first sign of life in other places in the universe.

One way to understand what early life might look like is to examine geological formations in very old rocks, such as the 3,500-million-year-old rocks in the Pilbara. Fortunately, these rocks are of great interest to geologists because they often hold valuable mineral deposits, so quite a lot is known about them. They are known to have been formed by volcanic action, so a second, complementary approach is to see what forms of life exist in active volcanoes. One such volcano is White Island in New Zealand. Single-cell life forms have been found there in water up to 123°C, so it is now known that life can exist from about -30°C to over 120°C.

In order to try to understand the evolutionary context of these single-cell organisms, biologists look at bio-markers in the geological samples that are characteristic of life and see how these evolve. This is analogous to looking at skeleton evolution in more advanced life forms. Already, a great deal has been learned about the geological environment on Mars. An early mission, Phoenix, found ice at northern latitudes. The channels suggest that there was flowing liquid at one point in Mars' geological history; that was almost certainly water. Imaging shows that channel formation is still taking place on the surface of Mars now, which suggests that, at times at least, there is fluid flow. It is too cold for pure water, so if this does turn out to be due to rivers, they would have to be highly saline to remain liquid at these temperatures.
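How salty is "highly saline"? The standard freezing-point depression formula, ΔT = i·Kf·m, gives a rough ideal-solution estimate (the molality below is an illustrative value near NaCl saturation; real brines, and the perchlorate salts often suggested for Mars, deviate from this simple model):

```python
# Ideal freezing-point depression for a brine: Delta_T = i * Kf * m.
# Real concentrated brines are non-ideal; this is a rough sketch only.
KF_WATER = 1.86      # K·kg/mol, cryoscopic constant of water
i_vant_hoff = 2      # NaCl dissociates into two ions
molality = 6.0       # mol/kg, illustrative value near NaCl saturation

delta_t = i_vant_hoff * KF_WATER * molality
freezing_point_c = 0.0 - delta_t
print(f"estimated freezing point: {freezing_point_c:.1f} °C")
```

Even a saturated common-salt brine only stays liquid to about -21°C (the NaCl-water eutectic), which is one reason salts with lower eutectic points are usually invoked for liquid flows on Mars.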

Earlier investigations suggested that there was methane in the Martian atmosphere; however, Curiosity has found none. The earlier observations are now thought to be due to the carbon-13 isotope of methane in the Earth's atmosphere.

Curiosity is an extremely expensive mission – it takes 265 people every day to keep it running – but its contribution to our understanding of Mars, the origins of the solar system and, by implication, other phenomena in the universe is enormous. There are a further 15 missions planned by various public and private agencies over the next decade or so.


1214th Ordinary General Meeting

Wednesday, 4 September 2013

"Open science" - Dr Matthew Todd

The speaker at the Society's 1214th ordinary general meeting was Dr Matthew Todd, a Senior Lecturer in Chemistry at the University of Sydney who is a leading proponent of the concept of "open science".

Dr Todd began with an example of the type of problem to which open science can provide a very practical solution. In Africa and parts of South America and Asia, the parasitic disease schistosomiasis (also known as bilharzia or snail fever) is endemic. Schistosomiasis is caused by infection by water-borne parasites that penetrate the skin and enter the bloodstream. Although the mortality rate is low, schistosomiasis is a serious chronic illness. It is particularly devastating to children – it damages internal organs, impairs growth and causes cognitive impairment. After malaria, it is the most socio-economically devastating disease in the world.

Schistosomiasis can be treated by a drug called praziquantel that is inexpensive and is administered orally. The problem is that praziquantel tablets are very bitter to the taste and, consequently, many people do not complete the course of treatment. But praziquantel is an organic molecule that exists as two stereoisomers (stereoisomers are molecules that exist in two forms, one being the mirror-image of the other, in much the same way as the left hand is the mirror-image of the right). Often in pharmacology, only one of the stereoisomers has the desired physiological effect and, indeed, this is the case with praziquantel. The "R" stereoisomer kills the parasite and does not have an unpleasant taste. The "S" stereoisomer is inactive and, fortuitously, is entirely responsible for the bitter taste. So why not simply make the R-form? Unfortunately, both forms are produced together in the reactions commonly used to synthesise this drug and are not easily separated in the manufacturing process. The best solution is to find catalysts and reaction conditions that favour the production of the desired stereoisomer over the other. However, there is no public funding available for the research, and private enterprise will not fund it because the drug is so cheap that the financial return is too low.

Another problem is that the normal research paradigm is sequential: a research grant is awarded; the work is done; the results are published and, if encouraging, will perhaps result in a further research grant. This can be dreadfully slow, and a far more efficient way of solving complex problems of this nature is to have collaborative research that can proceed concurrently rather than sequentially - parallel rather than serial processing, as it were. There are a number of examples of this type of collaboration being successful in areas such as astronomy, mathematics and biology. Dr Todd and his group at the University of Sydney explored using the open-science approach to develop a manufacturing route for the active, tasteless R-stereoisomer of praziquantel.

This approach resulted in rapid progress through collaboration of groups around the world, with at least two routes identified as potential practical manufacturing steps.

Dr Todd argues that the whole process of science is based on openness, the sharing of results and collaboration. Issues around patents can be important, but many of the key discoveries of the last century or so have not been subject to patent protection.


Compendium of 2013 news

The Poggendorf Lecture

"Biodiversity and the future of agriculture" - Professor Geoff Gurr

After a hiatus of 20 years, the Poggendorf Lecture was delivered in conjunction with Charles Sturt University, Orange, on Tuesday, 13 August 2013. The lecture was delivered by Professor Geoff Gurr, a biologist and entomologist and Professor of Applied Ecology at Charles Sturt University, where he specialises in the utilisation of natural solutions to control agricultural pests to partially or completely replace synthetic pesticides. 

The population of the world is increasing by 170,000 souls per day. Currently, 40% of land is used for some agricultural purpose, and the demand for agricultural products is expected to increase not only as a consequence of population growth but also because of the increasing living standards of people in the developing world. For example, the growth in meat demand is very strong, and it takes 10 kg of grain to produce 1 kg of animal protein. This leads to the conclusion that food production needs to double by 2050. The so-called "green revolution" of the last few decades has enabled the increase in food production to largely match population growth, largely through the application of nitrogen, phosphorus, some trace elements, water and the wide-scale use of pesticides. But was this revolution truly "green"? These inputs are largely non-renewable and, importantly, do not actually address the root cause of the problem – pest outbreaks are not due to a lack of pesticide, they are due to other imbalances in the environment. So the world is faced with a "wicked problem" of seeking food security with finite renewable resources, a declining availability of agricultural land, a changing climate and a moral obligation to preserve biodiversity (human activity, including agriculture, causes biodiversity loss at a rate about 10,000 times greater than the background rate).

Royal Society of NSW Forum 2013

Left to right: Antony Funnell, Prof Schmidt, Ms Wheeldon, Prof Schwartz, Prof Crossley.

The Royal Society of NSW Forum 2013 was held at the Powerhouse Museum on Thursday 6 June before a large audience. Antony Funnell of the ABC's Radio National moderated the discussion between: 

  • Professor Brian Schmidt AC FRSN, Nobel Prize winner 
  • Professor Steven Schwartz AM, former Macquarie University Vice Chancellor 
  • Ms Judith Wheeldon AM, former Principal of both Queenwood School for Girls and Abbotsleigh 
  • Professor Merlin Crossley, Dean of Science at the University of NSW 

Among other questions, our panellists discussed: will a falling focus on science and technology in education really be a problem for innovation in Australia? Is it a matter of basic education? Is it poor teaching? Is there a fundamental aversion to maths and science in Australia? Given our reliance on technology, why is there not a greater desire to utilise it and to develop it? Is there a "science literacy" problem in Australia? Why have we become passive about science and technology, rather than embracing it at its fundamental levels? 

In case you missed it, it was broadcast on ABC Radio National Big Ideas on Monday 17 June (click Forum 2013 to download a recording of the broadcast).

Annual awards evening and dinner

On Friday 19 April, the annual awards evening and annual dinner was held at the Union University and Schools Club in Sydney. The dinner was extremely well attended and the address by Judith Wheeldon AM was very topical and stimulated a lot of discussion. Ms Wheeldon presented the Clarke Medal to distinguished zoologist Marilyn Renfree, the Edgeworth David Medal to Dr Joanne Whittaker, a remarkable young geophysicist who is doing ground-breaking work on plate tectonics, and the Royal Society of NSW medal to John Hardie in recognition of his 40 years of contribution to the Society, six of which have been as its President.

Left to right: The President, Dr Donald Hector, Judith Wheeldon AM, Professor Marilyn Renfree AO, John Hardie MRSN, Dr Joanne Whittaker.

Inaugural Fellows Lecture held

The Society was proud to have Professor Michael Archer AM present the inaugural Fellows Lecture on Wednesday, 3 April 2013. Professor Archer was one of the first Fellows appointed by the Society, recognising his outstanding work as a palaeontologist, particularly in relation to the Riversleigh fossil find in Queensland, one of the richest fossil deposits in the world.

Prof Mike Archer AM FRSN delivers the inaugural Royal Society of NSW Fellows Lecture.

Governor invests new Fellows

On Wednesday, 13 March, the two Fellows appointed in 2012, Prof Brian Schmidt AC FAA FRS FRSN and Prof the Hon Barry Jones AO FAA FAHA FTSE FASSA FRSN, were formally invested by our patron, the Governor, Prof Marie Bashir AC CVO, at a ceremony at Government House. We were delighted that our awards advisory panel, chaired by the Chief Scientist and Engineer of NSW, Prof Mary O'Kane, and consisting of the Deans of Science of the NSW-based universities, were able to attend, together with a number of other distinguished guests.

Prof Brian Schmidt (centre-left) and Prof the Hon Barry Jones (centre-right) with the Governor after receiving their Fellowships, with Vice President Em. Prof Heinrich Hora (left) and President Dr Donald Hector (right) at Government House, Sydney.

The Poggendorf Lecture 2013

Tuesday, 13 August 2013

"Biodiversity and the future of agriculture" - Professor Geoff Gurr

After a hiatus of 20 years, the Poggendorf Lecture was delivered in conjunction with Charles Sturt University, Orange, on Tuesday, 13 August 2013. The lecture was delivered by Professor Geoff Gurr, a biologist and entomologist and Professor of Applied Ecology at Charles Sturt University, where he specialises in the utilisation of natural solutions to control agricultural pests to partially or completely replace synthetic pesticides.

The population of the world is increasing by 170,000 souls per day. Currently, 40% of land is used for some agricultural purpose, and the demand for agricultural products is expected to increase not only as a consequence of population growth but also because of the increasing living standards of people in the developing world. For example, the growth in meat demand is very strong, and it takes 10 kg of grain to produce 1 kg of animal protein. This leads to the conclusion that food production needs to double by 2050. The so-called "green revolution" of the last few decades has enabled the increase in food production to largely match population growth, largely through the application of nitrogen, phosphorus, some trace elements, water and the wide-scale use of pesticides. But was this revolution truly "green"? These inputs are largely non-renewable and, importantly, do not actually address the root cause of the problem – pest outbreaks are not due to a lack of pesticide, they are due to other imbalances in the environment. So the world is faced with a "wicked problem" of seeking food security with finite renewable resources, a declining availability of agricultural land, a changing climate and a moral obligation to preserve biodiversity (human activity, including agriculture, causes biodiversity loss at a rate about 10,000 times greater than the background rate).

Sustainable agricultural practices that are emerging can be considered in three areas: genetic (utilising the natural defence mechanisms identified in certain species and transferring these to other species); species (utilising the natural enemies of pests in order to control population); and ecosystems (developing landscapes that have high biodiversity that tends to equilibrate around sustainable species populations).

The thrust of Professor Gurr's work is that by integrating diverse approaches, including biological, cultural and chemical controls, hazards to humans and the environment can be minimised and, in many cases, productivity of agricultural systems can be improved. The principle underlying this is the acknowledgement that agricultural landscapes benefit from biodiversity and that this has significant benefit in terms of ecosystem services such as pollination of crops, reducing erosion, reducing contamination of water courses with excess nutrients and biological control of crop pests.

Generally, the greater the biological diversity, the fewer the pests. This is because the natural activity of predators, parasites and pathogens maintains potential pests' population densities at a lower level than would occur in their absence. In the case of monocultures, this balance is often upset, enabling the density of pests to reach plague proportions. The widely accepted agricultural response is to use synthetic pesticides, which often exacerbate the problem by further reducing biological diversity. In turn, the levels of artificial agents required to control pests increase, with consequent damage to the environment.
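The balance described here is often illustrated with the classic Lotka-Volterra predator-prey model. The sketch below (not from the lecture; all parameter values are arbitrary illustrative assumptions) shows that with predators present the pest population stays bounded, while removing them lets it grow unchecked:

```python
# Minimal Euler-integrated Lotka-Volterra predator-prey sketch.
# All parameters are arbitrary, chosen only to illustrate the dynamics.
def simulate(pest=10.0, predator=5.0, steps=5000, dt=0.01,
             growth=1.0, predation=0.1, efficiency=0.075, death=0.5):
    history = []
    for _ in range(steps):
        d_pest = (growth - predation * predator) * pest
        d_pred = (efficiency * pest - death) * predator
        pest += d_pest * dt
        predator += d_pred * dt
        history.append(pest)
    return history

with_predators = simulate()
without_predators = simulate(predator=0.0, efficiency=0.0)
print("peak pest level with predators:   ", max(with_predators))
print("peak pest level without predators:", max(without_predators))
```

With predators, the pest population oscillates around an equilibrium a few times its starting value; without them, it grows exponentially, which is the dynamic a pesticide-driven collapse in predator diversity can trigger.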

Professor Gurr described an example in China where rice production was being severely affected by a particular species of plant hopper. This species had evolved resistance to insecticides and was substantially reducing rice yield. Professor Gurr's group used the bund walls that retain water in rice fields to plant vegetation selected because it hosts predators of this species of plant hopper. They also introduced another species of plant hopper that did not affect rice yield and attacked the pest species. In addition, they planted species of flowers that attract parasitic wasps that attack the pest species. The result was a substantial reduction in the pest species, leading to significantly increased rice yield, with secondary benefits such as an increase in the frog population.

There is a common misconception that this type of biological control can have negative impact on yield but a meta-analysis of 286 projects demonstrated an average 80% increase in yield. The "green" approach to pest management potentially could double food production in 10 years: the challenge is to identify the value of ecosystem services and how to utilise them.

Historically, agricultural science has focused on agricultural production and environmental science has focused on protecting the environment – the two have coexisted almost as separate disciplines. If food security is to be achieved in the next few decades, there needs to be an integration of agricultural and environmental-protection practices. China has been very active in this area: 24% of agricultural land in China has been allocated some form of conservation status. Similarly, in Europe there is a trend towards farmers being encouraged to consider themselves stewards of the land, rather than owners.

Regrettably, Australia is not leading the way in this area. Nonetheless, there are examples of this type of approach, such as "alley farming", that provide shelter for natural species and encourage biological diversity, thereby significantly reducing the requirement for synthetic pesticides.

Professor Gurr concluded by observing that the world cannot double food production with current agricultural practices – they are simply unsustainable. If we learn to value ecosystem services, in particular recognising the importance of biodiversity, doubling food production – a requirement to feed the projected world population – is both achievable and potentially beneficial to the global ecosystem.


1213th Ordinary General Meeting

Wednesday, 7 August 2013

"How numbers came to rule the world: the impact of Luca Pacioli,
Leonardo da Vinci and the merchants of Venice on Wall Street" - Jane Gleeson-White

At the 1213th meeting of the Society at the Powerhouse Museum on Wednesday, 7 August 2013, Jane Gleeson-White outlined the argument she presented in her best-selling book Double Entry, a history of the impact of double-entry accounting on the development of the capitalist model that has shaped Western civilisation.

Until the 13th century, the prevailing arithmetic system used in Europe was the Roman system, which largely precluded complex operations such as multiplication and division. During the Renaissance, the Hindu-Arabic number system and algebra were introduced. One major figure in this was Luca Bartolomeo de Pacioli, a Renaissance friar and mathematician, and a colleague of Piero della Francesca and Leonardo da Vinci.

Pacioli wrote a number of major texts on mathematics and was one of the great influences on the development of maths during the Renaissance. He lived for a time in Venice and the merchants there were quick to introduce his system of double-entry book-keeping to record their mercantile transactions. (The double-entry system requires there to be two accounts for every transaction: one a credit account, the other debit account. For every creditor there must be a debtor; and for every debtor there must be a creditor.)
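The balancing rule at the heart of Pacioli's system can be sketched as a simple data structure: every transaction posts a debit to one account and an equal credit to another, so the ledger as a whole always nets to zero (a minimal illustration; the account names are invented):

```python
# Minimal double-entry ledger sketch: each transaction records a debit to one
# account and an equal credit to another, so total debits always equal credits.
from collections import defaultdict

class Ledger:
    def __init__(self):
        self.accounts = defaultdict(float)  # positive = net debit balance

    def post(self, debit_account, credit_account, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        self.accounts[debit_account] += amount   # debit entry
        self.accounts[credit_account] -= amount  # matching credit entry

    def trial_balance(self):
        # In a correct double-entry ledger this always sums to zero.
        return sum(self.accounts.values())

ledger = Ledger()
ledger.post("Inventory", "Cash", 500.0)  # buy goods for cash
ledger.post("Cash", "Sales", 800.0)      # sell goods for cash
print(dict(ledger.accounts), ledger.trial_balance())
```

The zero trial balance is the built-in error check that made the Venetian method so robust: any one-sided entry is immediately visible as an imbalance.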

Although merchants had recorded their transactions since Phoenician times, these records were largely narrative in nature. The merchants of Venice were able to abstract and summarise financial performance in a single accounting system that was independent of the goods being transacted. Over the next couple of centuries the double-entry bookkeeping system was adopted, first throughout Europe and then in the rest of the world.

Gleeson-White argues that this innovation was fundamental to the development of capitalism and the consumer-oriented economic system that prevails worldwide today. It led to the system of national accounts that is used by governments that distils all human activity into a single number: gross domestic product or GDP. She further argues that double-entry book-keeping was a major influence on the scientific revolution and that together these led to the industrialisation of the world and the unsustainable stress that it is currently facing. These claims are not uncontentious and there was a lively discussion after the talk.

Jane's talk was broadcast by the ABC on Radio National's Big Ideas on Tuesday 3 September 2013. Click 1213th OGM to download the RN broadcast.



All rights reserved; copyright © The Royal Society of NSW.