Royal Society of NSW News & Events


1219th Ordinary General Meeting

Wednesday, 5 March 2014

"Big data knowledge discovery: machine learning meets natural science"

Professor Hugh Durrant-Whyte FRS, CEO, National ICT Australia

Hugh Durrant-Whyte is an internationally-recognised expert on the analysis of "big data" – the mass of information generated by current information and communication technologies. Much of this is "metadata" – data captured as a by-product of some activity (for example, a digital photograph also records camera settings, capture date and so on, and telecommunication companies keep data about every mobile phone call made).

2.5×10¹⁸ bytes of data are generated every day – there is immense value in mining this data but doing so requires sophisticated analytical techniques. "Data analytics" is the term coined for technologies to analyse this data in areas as varied as the finance industry, the health industry, infrastructure planning, failure analysis in mechanical and electronic equipment and environmental analysis, to name but a few examples. Data analytics uses Bayesian probability theory (named after Rev Thomas Bayes, an 18th-century mathematician) to prepare quantitative models of existing data, gather new data to address remaining questions and then update the model to incorporate both the old and new data.
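The update cycle just described – model, gather new data, update – can be sketched with the simplest conjugate Bayesian example. The Beta-Binomial scenario and the numbers below are illustrative only, not drawn from the talk:

```python
def bayes_update(prior_alpha, prior_beta, successes, failures):
    """Conjugate Beta-Binomial update: the posterior simply adds the
    observed counts to the prior's pseudo-counts."""
    return prior_alpha + successes, prior_beta + failures

# Prior belief about some event rate: roughly 1 in 10, encoded as Beta(1, 9).
alpha, beta = 1, 9

# New data arrive: 30 occurrences in 100 observations.
alpha, beta = bayes_update(alpha, beta, 30, 70)

# The posterior mean now incorporates both the old belief and the new data.
posterior_mean = alpha / (alpha + beta)
print(posterior_mean)  # 31/110 ≈ 0.2818
```

Repeating the same call as further data arrive is exactly the "update the model to incorporate both the old and new data" loop described above.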

Data analytics can be modelled using three types of mathematical functions: discrete functions that describe, for example, events or people's actions; finite probability functions, such as signals or locations; and infinite probability functions, such as spatial or temporal fields. As the masses of data available increase, the analysis can converge on knowledge. For example, payment patterns exhibited by individuals can be aggregated to the behaviours of bank branch customers, giving an understanding of consumer behaviour. On the other side of the table, customers can use masses of data to take advantage of the best deals available or to customise internet-based content that they may wish to buy.

Where masses of historical data are available (for example, in managing water assets), readily available historical parameters can be analysed for such applications as predicting equipment failures. In the case of water asset management, pipe age, soil type and similar parameters can be analysed to give a probabilistic estimate of when a water main might fail.
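The pipe-failure example can be made concrete with a toy probabilistic model. The logistic form and all of the coefficients below are invented for illustration – in practice they would be fitted to the utility's historical records of pipe age, soil type and past failures:

```python
import math

def failure_probability(age_years, soil_corrosivity,
                        b0=-8.0, b_age=0.06, b_soil=1.2):
    """Toy logistic model: probability a water main fails in the next year.
    The coefficients are hypothetical, not fitted to real data."""
    z = b0 + b_age * age_years + b_soil * soil_corrosivity
    return 1.0 / (1.0 + math.exp(-z))

# An old pipe in corrosive soil is flagged as far riskier than a new pipe
# in benign soil, which is the kind of ranking an asset manager needs.
print(failure_probability(80, 2.0))  # high-risk pipe
print(failure_probability(10, 0.5))  # low-risk pipe
```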

The mining industry has invested large amounts of money in developing systems that use masses of existing information to automate mine operation. These draw together all available data about the mine – the surface, the subsurface, mapping, drilling – into a single, completely integrated, real-time representation of the mine.

The purpose of National ICT Australia (NICTA) is to use these data analytics approaches to produce leading-edge technologies and models for applications as varied as financial modelling and creating large-scale, fully integrated data maps of regions (perhaps even as large as continental Australia). There is also a particular focus on data-driven discovery in the natural sciences, in applications as varied as modelling ancient plate tectonics to predict mineralisation (on a timeframe of as much as 1.5 billion years) or ecological modelling, for example, predicting the growth of trees. Ultimately, these may be able to be integrated into one massive model of the Australian continent.


1220th Ordinary General Meeting

Wednesday, 2 April 2014

"The Jameson cell"

Laureate Professor Graeme Jameson AO

At the 1220th ordinary general meeting of the Society, Laureate Professor Graeme Jameson described the development of the Jameson cell, one of the most important technological contributions to the Australian economy in the last 50 years.

The Jameson cell is a flotation cell used to concentrate the valuable components of ore in minerals processing. In a typical mining operation, the first two stages of extracting minerals are the mine itself, from which the ore is recovered, and the concentrator, where the valuable mineral is extracted from the rest. Generally, the valuable components are no more than 2% of the ore recovered, so there is a massive challenge in isolating this from spoil for further processing. An important technology developed to achieve this concentration step was the flotation cell, a process first developed early in the 20th century.

In flotation, the ore is ground up into very fine particles and dispersed with water and surfactants in a large mixing vessel that is kept agitated and into the bottom of which compressed air can be introduced. Reagents are added to make the valuable mineral particles exposed during the crushing hydrophobic. Air is bubbled through the suspension and the hydrophobic mineral particles attach to the bubbles, float to the surface as a froth and are then skimmed off for further processing and enrichment. Because large volumes of ore have to be treated to recover a relatively small proportion of valuable product, this is a very expensive step in recovering minerals: first, the ore has to be ground to very fine particle sizes (typically around 150 micrometres) – this takes a lot of energy; and second, the volume that has to be treated in preparing the slurry is large, so processing equipment is big and expensive. Any technology that reduces either the cost of grinding or the size of the processing equipment can have a major impact on the cost of production. The Jameson cell revolutionised the flotation process by reducing the size of the equipment needed to efficiently float off the minerals.

Over a period of several years, Professor Jameson identified the optimum parameters for particle size and the corresponding optimum size for the air bubbles used to float the treated particles. Generally, particle size needs to be less than 150 micrometres or, even better, less than 100 micrometres. The smaller the particle, the more likely it is to consist of the pure mineral. But the real technological breakthrough was identifying that the optimum bubble size is about 300 micrometres. Until then, conventional cells operated using bubbles about three times that size, at about 1 mm diameter. Having identified the optimum bubble size, the challenge was then to design equipment that produced the right amount of shear to generate bubbles of 300 micrometres diameter. This turned out to be relatively simple, using high-pressure jets of water to entrain the air.

Much of the commercialisation work was done at Mount Isa in the 1980s and 1990s. Since then, the cell has been deployed around the world and is used routinely to extract coal, copper, lead, zinc and potash, and in other industries such as oil-sands extraction and industrial waste treatment. Over 300 cells have been installed and the cumulative value created by this invention is more than $25 billion.

Professor Jameson was named NSW Scientist of the Year in 2013.


1194th General Meeting

"Schizophrenia: from neuropathology to new treatments"

Professor Cyndi Shannon Weickert, Macquarie Group Foundation Chair of Schizophrenia Research, Neuroscience Research Australia and UNSW, and Professor, School of Psychiatry, UNSW

Wednesday 3 August 2011 at 6.30 pm

Seminar Room 102, New Law Building, University of Sydney

Is schizophrenia caused by genes or environment? This question was posed by Professor Cyndi Shannon Weickert at the 1194th ordinary general meeting of the Society. 

Schizophrenia was first formally classified in 1887. Despite extensive pathological investigation, no clear distinction was identified between the brains of people who have schizophrenia and those who do not. Until the 1930s it was considered to be primarily a behavioural disorder, put down to bad mothering. But in the 1930s treatments involving insulin and shock therapy were shown to be somewhat effective. There was a breakthrough in 1952 when D2R blockers were introduced and found to be effective against some of the symptoms. However, it was not until 1988 that the first definite genetic link was established. Since then progress has been swift, and in the last decade it has been shown that there may be several hundred genes involved in the disorder. Because of the large number of genes that are implicated, identifying treatments that target these genes is extraordinarily complex. Most researchers in the field now believe that the disease has both environmental and genetic origins.

The approach taken by Professor Shannon Weickert's group is to attempt to identify the pathology of various genetic pathways to the disease, in particular identifying molecules that can be new drug targets. Once these have been postulated, the aim is to use existing drugs which are either known to or believed to affect those targets and then to test their effect in clinical trials. This approach has the advantage of using drugs that have already been approved for use in humans, thereby avoiding the necessity for time-consuming and expensive early-stage clinical trials that establish general parameters such as toxicity and dosage levels.

One notable aspect of schizophrenia is that it is virtually never found in children prior to adolescence. Most cases of schizophrenia are diagnosed from mid-teens to the early 20s but, interestingly, there is a second peak among women at menopause. This suggests that sex hormones could be an important part of the mechanism causing the disorder. Oestrogen receptors are found in the human cortex and act as "transcription factors" – proteins that carry hormonal signals into the nucleus of the neuron and regulate gene expression. On investigating oestrogen receptor proteins, a mutation specific to schizophrenia has been found in the transcription factor protein called ESR1. The mutant protein cannot bind to oestrogen and hence cannot pass hormonal signals into the nucleus of the cell. Hence, the cell cannot activate important genes that produce their normal proteins, and this may cause some of the symptoms of schizophrenia.

An existing drug, raloxifene, has already been approved as a selective oestrogen receptor modulator for treating various disorders in postmenopausal women. Raloxifene has been found to stimulate the oestrogen receptor and overcome the effect of the ESR1 mutation. However, the great variability of genes means that the effect of a drug on one specific mutation is likely to be masked, so clinical trials need to be carefully designed to make the effect apparent. One such trial is currently being conducted by Professor Shannon Weickert's group: a double-blind trial in which patients and control groups are treated in two stages, with all trial participants receiving the drug in one or other of the stages. This clinical trial is still under way and is expected to be completed towards the end of this year. If successful, it may be a major step in establishing personalised drug treatments for the 1% of the human population that currently suffers the debilitating effects of schizophrenia. [The May 2009 edition of the ABC's Australian Story featured Professor Shannon Weickert's work; anyone interested in the clinical trial may wish to view the episode or read the transcript.]


1222nd Ordinary General Meeting

Wednesday, 4 June 2014

"What lessons have we learnt from the Global Financial Crisis?"

Professor Robert Marks

In 2008, the world suffered "the equivalent of cardiac arrest", according to the Financial Times. It became virtually impossible for any institution to finance itself (that is, borrow in the markets) for longer than overnight. With the collapse of Lehman Bros, interbank credit markets froze and counterparty risk was considered too great for prospective lenders to take on the transactions. The London interbank overnight lending rate, typically in the range of 0.2% to 0.8%, spiked to over 3%. This situation raises two questions: what caused this global financial crisis (GFC)? And how can we attempt to avoid similar crises in the future? The origins of the crisis go back more than 30 years.

Starting in 1977, there were substantial changes made to US investment legislation. Early in this period, the aim was to make finance more readily available to low-income borrowers, to progressively eliminate controls on mortgage rates and to remove discrimination in the US housing market. In 1999 and 2000, there was substantial deregulation, with major changes to long-standing legislation, in particular the repeal of the Glass-Steagall Act of 1933 that had imposed restrictions on banks during the Great Depression. There were also reforms to the Federal housing finance regulatory agencies, loosening their lending requirements.

This period of financial deregulation encouraged consolidation and demutualisation of many financial institutions that had been mutually or privately owned, with these being floated as public companies. Whereas previously their lending practices had been conservative because they had been risking their own money, now the money at risk belonged to other people! There was also great creativity in developing new financial products and instruments: Mortgage-Backed Securities (MBS), structured investment vehicles, Credit Default Swaps (CDS) and Collateralised Mortgage Obligations (CMOs).

In the early 2000s, the September 11 attacks, coming not long after the bursting of the "tech bubble", led to a prolonged period of low interest rates. US fiscal policy was heavily in deficit, leading to massive issuance of US bonds that were largely bought by China and other Asian countries. At the same time there was further financial deregulation, relaxing capital requirements and encouraging higher gearing in financial institutions.

Unsurprisingly, firms responded to the incentives put before them. The market for the new financial instruments boomed and rating agencies responded by changing the way in which they charged for their services – they began charging the firms whose products they were rating, rather than the potential buyers of the products. In the US, the financial sector grew from 3.5% of GDP in 1960 to nearly 8% of GDP in 2008.

Drawing these strands together, there were four causes of the GFC: the repeal of the Glass-Steagall Act; the decision by Congress not to regulate derivatives; the relaxation of regulations that allowed banks to expand their gearing; and the change by the ratings agencies to charge the issuer rather than the buyer of rated products.

How likely is this type of situation to occur again in the near future?

Unfortunately, a number of European countries may be facing similar challenges unless they take steps to avoid the problems that the US experienced. Fortunately, Australia avoided the worst of the GFC, well-served by the "four pillars" banking policy. However, there needs to be recognition that information is asymmetric and that the issue is really not one of risk but rather of uncertainty, where there are no simple answers. As George Santayana observed in 1905, "those who cannot remember the past are condemned to repeat it".


1223rd Ordinary General Meeting

"What causes MS? The impact of the genetic revolution"

Professor Graeme Stewart AM

Wednesday, 2 July 2014

Professor Graeme Stewart AM, director of clinical immunology at Westmead Hospital, has researched the genetic influences on disease, in particular on multiple sclerosis (MS). MS is the commonest chronic neurological disorder of young adults. It usually starts with a relapsing/remitting phase (symptoms occur and then go into remission for extended periods), commonly with onset at about the age of 30. The disease can be relatively benign with limited periods of disability, it can present as a relapsing/remitting disease with a gradual increase in disability, or in about 10-20% of patients it can present as "primary progressive", where disability progressively increases over time. MS is caused by the body's immune system malfunctioning – macrophages devour the myelin sheath around nerve cells, exposing the nerve axon and thereby disrupting the flow of information along the nerve cell. The body is able to repair the damage by re-myelinating the nerve cells after this initial attack; however, if the myelin is attacked a second time in the same place, the body is unable to repair the sheath and relapse occurs. Hence the symptoms of the disease progress.

The important question is: what causes this? MS is a disease which is clearly influenced by genes and environment. Studies of the disease in identical twins show 30% concordance, whereas in fraternal twins there is 4% concordance. The background incidence rate of MS is 0.4% of the population. This suggests that genetic influences are very significant but environmental factors are also a consideration. One interesting environmental effect is that the incidence of MS is quite highly correlated with latitude – for example, in Australia there is a 4- to 7-fold hazard ratio between North Queensland and Tasmania – in the southern hemisphere, the further south you live, the more likely you are to contract MS. The most likely reasons for this are reduced exposure to UV-B light the further you are from the equator, and the resulting vitamin D deficiency. Other environmental factors include smoking and exposure to the Epstein-Barr virus that causes glandular fever (almost all MS patients have been infected with Epstein-Barr virus). The fact that Epstein-Barr virus is implicated in virtually all MS cases may present an opportunity for treatment if the effect of this virus on DNA is understood.
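The twin-concordance figures quoted above translate into striking relative risks, which is the basis for the claim that genetic influences are very significant. A quick back-of-the-envelope calculation using the numbers from the talk:

```python
# Figures quoted in the talk.
identical_concordance = 0.30   # identical twins
fraternal_concordance = 0.04   # fraternal twins
background_incidence = 0.004   # 0.4% of the population

# Relative risk of MS for the twin of a patient, compared with the population.
rr_identical = identical_concordance / background_incidence
rr_fraternal = fraternal_concordance / background_incidence
print(round(rr_identical))  # → 75: identical twins share all their genes
print(round(rr_fraternal))  # → 10: fraternal twins share about half
```

That identical twins are at 75 times the background risk, yet concordance is still well short of 100%, is precisely why both genes and environment must be involved.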

Since the early 1970s, there has been a search for the genes implicated in MS. The first was found in 1972 but it was not until 2007 that the second gene was identified. Since then, as a consequence of the human genome project and widespread sequencing technology, together with recent advances in computer power and statistical algorithms to handle large amounts of data, over 100 genes have been identified. Pursuing genetic associations is expected to give insight into the pathogenesis, in particular the interaction between genes and environment. It is hoped that this will lead to interventions to prevent the disease from progressing. In addition, identifying genetic biomarkers may provide major opportunities for new treatments, including personalised treatments based on the individual's genetic profile.

There has been substantial progress in treatments for MS, including trials of drugs to stop T cells crossing the blood-brain barrier, drugs that capture lymphocytes and hold them in the lymph nodes and early indications that drugs targeting specific proteins identified through genetic analysis might be useful. In addition, trials are underway to see whether large doses of vitamin D might have some impact and whether increased exposure to ultraviolet light might also offer some improvement.


1193rd General Meeting

"Stem cells and regenerative medicine: prospects for realising the Prometheus myth"

Professor John Rasko, Centenary Institute

Wednesday 6 July 2011, 6.30 for 7 pm

Lecture Theatre 106, New Law Building, University of Sydney

Professor John Rasko was appointed to the first clinical gene therapy position in Australia. Currently, he is head of the gene and stem cell therapy programme at the Centenary Institute and is a Professor in the faculty of medicine at Sydney University. At the general meeting of the Society on Wednesday 6 July, Professor Rasko gave a wide-ranging talk on the status of cellular therapies for regenerative medicine and cancer treatment and the potential and use of both embryonic and adult stem cells in the treatment of a wide range of diseases. Importantly, there was a comprehensive discussion on the ethical issues in relation to the use of both embryonic and adult stem cells, not only in the treatment of disease but also the implications for technologies such as in-vitro fertilisation. 

The Centenary Institute has a large research programme for cellular medicine and extraordinarily sophisticated facilities for the manufacture and cultivation of biological material. This includes four specialised laboratories with hyper-pure air flow, positive air-pressure differentials (both to prevent contamination and to prevent the release of biologically-active material) and sophisticated human-access protocols.

Research programmes include techniques such as extraction and cultivation of red cells and bone marrow prior to treatments such as chemotherapy and radiation therapy in order to speed up patient recovery, and exploring the extent to which adult stem cells might be used as a source of genetic material to help rebuild damaged organs such as the liver, and blood and bone marrow.

Professor Rasko explained the potential of embryonic stem cells that, one day, might be used to treat a range of diseases by replacing damaged or diseased tissue. Embryonic stem cells are taken from embryos at the time when little differentiation between cells has taken place. These stem cells are theoretically able to be cultivated and differentiated as "master cells" that could produce all types of fully-differentiated tissue in the body. Despite this theoretical potential, progress has been slow. The medical drawback of using embryonic stem cells is that the embryo has the same immune signature as the mother and father, so recipients of tissue cultivated from these cells would require constant immunosuppressant therapy. So far, only three clinical trials have been undertaken. But a more significant challenge for embryonic stem cells may well be the moral issues. Pro-life groups have opposed the use of embryonic stem cells because they believe that cells taken from an embryo have the potential to form a complete individual; to them, destroying such an embryo amounts to taking a human life.

An alternative to using embryonic stem cells is to take adult cells and to reprogram them to make them the same as embryonic cells. In the last few years, there appears to have been significant progress in making so-called "induced pluripotent stem cells". If this technology turns out to be viable (and there are still many challenges that have been identified), it could solve a number of the issues of embryonic stem cells. For example, because adult cells could be taken from the individual requiring treatment, induced pluripotent stem cells could be used without the need for immunosuppressant therapy. It also avoids the moral controversy that surrounds the destruction of an embryo. But this may not be the panacea that many had hoped for. There are two medical disadvantages in "reprogramming" cells: two of the transgenes required are oncogenic and so pose an enhanced risk of inducing cancer in the patient; and the viruses used to carry the transgenes can be incorporated into the genetic material of cells, with unpredictable consequences. It also appears that the older the organism, the more abnormalities there are in the resultant stem cells. Furthermore, the ethical issues may not be solved either. To date, the efficacy of all adult stem cells needs to be compared against embryonic stem cells, so the research programmes cannot be separated. In addition, there is the moral argument that if every cell in your body has the potential to produce a full range of differentiated cells (perhaps even a fully-formed individual), then it could be argued that every cell in your body has the same moral status as you do!


1224th Ordinary General Meeting

"Saving Australia through science education"

Emeritus Scientia Professor Eugenie Lumbers AM DistFRSN

Wednesday, 6 August 2014

The world is experiencing an exponential rate of technological progress. Change was relatively gradual from the domestication of the horse until the 17th century. Indeed, in the early stages of the Industrial Revolution, industry was still heavily dependent on horse-drawn transport. In 1900, just 14 years after the invention of the motor car, there were still 300,000 horses in service in London. That same year, there were 0.11 cars per thousand people in the US; in 2009 there were 828. This enormous, rapidly accelerating technological change took place as a consequence of science and its application in the development of technology. The question is why there was such enthusiasm for science in the 1940s and 1950s, and why this has largely disappeared today in many countries, not the least of which is Australia. This poses a major challenge for Australia – how will we keep up with technological progress when few people are interested in seeking a science or technology education? Despite the apparent interest in science – in a multitude of TV programmes, for example – this actually positions science as entertainment, not as true science.

Despite this rapid shift away from science, Australia was still doing well by international standards until the late 20th century. In 2000, Australia ranked number three in the world (after South Korea and Japan) in the OECD Programme for International Student Assessment (PISA) test, a test that measures problem-solving capability in 15-year-olds. In the latest test, in 2012, Australia ranked number 8 (after Singapore, South Korea, Japan, China, Hong Kong, Taipei and Canada). It is not surprising that Australia's ranking is slipping when only 51% of students take a science subject in Year 12 and fewer than 20% study chemistry or physics. (Interestingly, biology is somewhat higher at 25% because it is seen as being "less academic".) What will the future hold when the technologically-educated people of today are gone? It is extraordinary that 76% of Australians see science as important to Australia's future but not as directly relevant to themselves.

The Academy of Science has tried to address this through its "Primary Connections" programme, an inquiry-based programme to help teachers develop their teaching programmes and to provide curriculum resources. Similarly, the Academy's "Science by Doing" programme for secondary schools is aimed at stimulating the all-important interest and enjoyment in science for children in early secondary school so that they go on to choose a career in science.


1192nd General Meeting

"Variation of fundamental constants: from the Big Bang to atomic clocks"

Professor Victor Flambaum, School of Physics, University of New South Wales

Wednesday 1 June 2011 at 6.30 pm

Lecture Theatre 106, New Law Building, University of Sydney

Modern unification theories suggest that the fundamental constants (like the speed of light) may change in an expanding Universe. The study of quasar spectra has indicated variation of the fine-structure constant alpha in space (alpha is the dimensionless combination of the electron charge, the speed of light and Planck's constant). This spatial variation could explain the fine tuning of the fundamental constants which allows humans (and any life) to appear. If the fundamental constants were even slightly different, life could not exist. We appeared in the region of the Universe where the values of the fundamental constants are consistent with our existence.

These astrophysical results may be used to predict the variation effects for atomic clocks. These effects in atomic clocks are very small and require extremely high precision to measure. Therefore, we are searching for enhanced effects of the variation. One of our proposals is to use a nuclear clock, where the effect is enhanced by five orders of magnitude.


1191st General Meeting

"Heading towards the world's largest telescope: the Square Kilometre Array"

Professor Michael Burton, School of Physics, University of NSW

Wednesday 4 May 2011 at 6.30 pm

Eastern Avenue Auditorium, University of Sydney

Meeting report by Donald Hector 

What do the kinetic energy of a falling snow-flake and radio telescopes have in common? Well, as Professor Michael Burton pointed out in his talk on the Square Kilometre Array (SKA), the energy of a falling snowflake is about 30 microjoules and this is greater than all of the radio energy ever collected by all the radio telescopes in the world! These instruments are very sensitive! Radio astronomy looks at a part of the electromagnetic spectrum at wavelengths from 1 m to 1 km. Observations in the visible spectrum are badly affected by dust but this is not the case in the radio spectrum. Thus by combining information from optical, infrared and radio telescopes we can get a much more complete picture of what's going on in the universe. But because of the long wavelengths of radio waves, these instruments have to be very big. For example, the Parkes telescope with its 64 m diameter dish has an area of about 1000 m². This telescope can resolve galaxies but in order to increase the resolution to look inside galaxies, much larger instruments are needed. 

The largest radio telescope in the world at the moment is the Very Large Array (VLA) in New Mexico. This instrument has 27 dishes, each of 25 m diameter, with a total area of 10,000 m². These antennas are configured in a Y-shaped arrangement that can deliver an effective maximum baseline of 32 km. Data from each antenna are integrated using interferometry techniques, effectively giving a telescope of this aperture. This substantially increases the effective resolution of the instrument. The VLA is capable of looking at radio sources such as pulsars and quasars, and gives insights into the formation of galaxies. If we are to be able to look further back through the history of the universe to the dust from which galaxies form, we need instruments orders of magnitude bigger than the VLA, and that's where the SKA comes in.
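The reason baseline length matters so much is the diffraction limit: the angular resolution of an interferometer is roughly the observing wavelength divided by the baseline. A rough illustration (the 21 cm hydrogen-line wavelength is chosen here simply as a familiar example):

```python
import math

ARCSEC_PER_RADIAN = 180.0 / math.pi * 3600.0  # ≈ 206,265

def resolution_arcsec(wavelength_m, baseline_m):
    """Approximate diffraction-limited resolution, theta ≈ lambda / B."""
    return wavelength_m / baseline_m * ARCSEC_PER_RADIAN

print(resolution_arcsec(0.21, 64))         # single 64 m dish (Parkes-sized)
print(resolution_arcsec(0.21, 32_000))     # 32 km baseline (VLA-scale)
print(resolution_arcsec(0.21, 2_000_000))  # 2,000 km baseline (SKA-scale)
```

Going from a single dish to a continent-scale baseline sharpens the resolution by the same factor as the baseline grows – more than four orders of magnitude here.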

The SKA will be the largest telescope ever built with a collection area of 1,000,000 m² and a baseline of at least 2,000 km. The SKA will be able to peer far back into the history of the universe to observe the first black holes and stars, to search for Earth-like planets, to test aspects of general relativity, and to explore the origins of cosmic magnetism. 

The total cost of this project will be about $3 billion and the telescope is expected to be in full operation by 2025. Because of the cost of the project, up to 20 countries will be involved in the investment. About $450 million has been invested so far with prototype technologies being constructed in potential locations for the final instrument in southern Africa and Western Australia. The core instrument (where most of the dishes are located) needs to be sited in a "radio quiet" location. It needs to be flat, open, geologically stable and well away from man-made sources of radio waves. Final site selection is expected to be complete next year. The telescope will come on-line over a period of about 10 years, with the low and mid-frequency capabilities completed by 2023 and the whole instrument by 2025. Australia is well placed to be selected as the final site, given our leadership in radio astronomy and the 'radio quietness' of outback Australia. If Australia is chosen, the core instrument will be located at the Murchison Radio Observatory about 500 km north-east of Geraldton and will have dishes extending from Western Australia to New Zealand, giving a total baseline of 5,500 km.


1225th Ordinary General Meeting

"The fourth dimension and beyond - the paradox of working in unimaginable worlds"

Emeritus Scientia Professor Ian Sloan AO FRSN

Date: Wednesday 3 September 2014

Professor Ian Sloan is not content to work in an environment of four dimensions – he is quite at home in spaces with many more dimensions than most of us are accustomed to. Many mathematical problems can be considered as problems in multidimensional space – the question is how do we imagine these environments? The dimension of a space can be considered to be the number of directions that you can go from any single point within it. For example, in our four-dimensional world, from any point we can go in three spatial directions, plus time. If we are in a six-dimensional environment, we can go in six directions from any given point and mathematically we need no more than six variables to describe this environment. But why would we be interested in multidimensional spaces?

Many problems are best analysed in multi-dimensions. For example, a shop may have 250 stock items. This can be thought of as a single point in 250-dimensional space. Each stock item has a price, so there are another 250 dimensions to consider. One area where this approach has a major application is evaluating certain types of financial transactions such as derivatives.

An investor may want to analyse a potential investment in Woolworths shares, for example. The payoff might be thought of as the closing share-price over a period of 250 trading days in, say, $10 increments. Using multidimensional mathematics, the investor can calculate the expected payoff at a certain trading day on the basis of what the closing share price might be. Such calculations soon become extremely complex – in fact, too complex to be evaluated (this is known as "the curse of dimensionality", a term coined by Richard Bellman, a noted researcher in this field). And what if the payoff can take any value, rather than being in $10 increments – does this make it even harder? Well, fortunately, it does not.
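The blow-up is easy to see with a little arithmetic. A hypothetical illustration (the numbers are chosen for the example, not taken from the lecture): evaluating a payoff on a grid requires a number of grid points that grows exponentially with the number of dimensions.

```python
# Grid-based evaluation suffers the "curse of dimensionality":
# the number of grid points grows exponentially with the dimension.
levels = 10   # say, 10 possible price levels per day ($10 increments)
days = 250    # one dimension per trading day

grid_points = levels ** days  # 10**250 points -- far beyond any computer's reach
```

Even at a billion evaluations per second, no computer could ever visit more than a vanishing fraction of those points, which is why a different approach is needed.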

Using a statistical approach known as the Monte Carlo method, these highly complex functions can be evaluated quite accurately. The Monte Carlo method may be thought of as a technique whereby one randomly throws points at a target and evaluates whether the points fall on the target or outside it. If the target can be mathematically described, the functions can be evaluated with considerable accuracy after no more than a few thousand random "throws". This enables highly-complex derivative functions to be evaluated quite accurately. The problem then lies in the assumptions underlying the mathematical model used to define the "target". Flaws in the underlying assumptions have resulted in many a lost fortune!
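The dart-throwing picture can be sketched in a few lines of Python. This is a minimal illustration; the target (a quarter-disc) and the number of throws are chosen for the example, not taken from the lecture:

```python
import random

def monte_carlo_area(inside, n_throws=100_000, seed=1):
    """Estimate the area of a target inside the unit square by random throws."""
    rng = random.Random(seed)
    hits = sum(inside(rng.random(), rng.random()) for _ in range(n_throws))
    return hits / n_throws  # fraction of throws that land on the target

# Target: the quarter-disc x^2 + y^2 <= 1, whose true area is pi/4 (about 0.785)
area = monte_carlo_area(lambda x, y: x * x + y * y <= 1.0)
```

The statistical error of such an estimate shrinks like 1/sqrt(n) regardless of the number of dimensions, which is what allows the method to sidestep the curse of dimensionality.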

Despite the accuracy of the Monte Carlo method, with highly complex functions the number of random throws can become very large, and the question arises: can we do better than generating the throws randomly? For some problems, using a structured pattern of points, for example a lattice, has been shown to converge much more quickly.
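One such pattern is a rank-1 lattice rule. The sketch below is an illustrative construction, not code from the talk; the Fibonacci generating vector used here is a standard textbook choice for two dimensions. It replaces random throws with evenly spread lattice points:

```python
def lattice_points(n, z):
    """Rank-1 lattice rule: the n points ((i*z_1 mod n)/n, ..., (i*z_d mod n)/n)."""
    return [tuple((i * zj % n) / n for zj in z) for i in range(n)]

def lattice_estimate(f, n, z):
    """Equal-weight average of f over the lattice points."""
    return sum(f(*p) for p in lattice_points(n, z)) / n

# Integrate f(x, y) = x*y over the unit square (true value 1/4)
# using a 2-D Fibonacci lattice: n = 987 points, generating vector z = (1, 610).
est = lattice_estimate(lambda x, y: x * y, 987, (1, 610))
```

For smooth integrands, a well-chosen lattice converges much faster than the 1/sqrt(n) rate of purely random sampling; the art lies in choosing the generating vector, which is a central theme of Professor Sloan's research area.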

Multidimensional mathematics is one of our most powerful tools in solving problems from financial derivatives, to metadata analysis, to cosmology. Professor Sloan provided a particularly clear insight into a highly complex and very powerful mathematical technique.


The Dirac Lecture 2011

"Beauty and truth: their intersection in mathematics and science"

Robert, Lord May of Oxford, AC FRSN

Friday 29 April 2011 at 6.30 pm

Scientia Building, University of NSW

Meeting report by Donald Hector 

On 29 April 2011, Robert, Lord May of Oxford, arguably the greatest mathematician that Australia has produced, was invested as a Fellow of the Royal Society of NSW by the Governor. Earlier that day, Lord May presented the Dirac Lecture at the University of New South Wales, jointly sponsored by the Society. He took us on an interesting exploration of some of the important concepts of mathematics, from Euclidean geometry, via the concept of imaginary numbers, to the mathematics of fractals and chaos theory, and the extraordinary power of mathematics to describe observed real-world phenomena.

Updating the observation by Galileo, "this grand book is written in the language of mathematics, and its characters are triangles, circles and other geometric objects", Lord May pointed out that rather than triangles and circles, today the mathematical objects are more likely to be fractals and "strange attractors". Nonetheless, as Galileo observed, and referring to the examples of Julia sets and Mandelbrot sets, there is great beauty in the elegance with which we can both describe and understand the immense complexity of the universe. He went on to explore the paradigm shift that Einstein divined from the results of the Michelson-Morley experiment, which had found that the speed of light is the same for all observers. Einstein's formulation of the special theory of relativity led to a profound shift in our understanding of the relationships between momentum, mass and energy that has enabled extraordinary insights into the nature of the universe, from gravity to nuclear fission. Lord May pointed out that, regrettably, many of the great contributors do not get the recognition that they deserve. In his view, Paul Dirac was such a person – his formulation of the Dirac equation and its implication of the existence of positrons was one of the greatest steps forward in theoretical physics in the 20th century, yet his name is nowhere near as well known as that of Einstein.

Quoting Keats "beauty is truth, truth beauty – that is all ye know on earth and all ye need to know", Lord May observed: well yes, but not really.


1226th Ordinary General Meeting

"Australia's most spectacular environmental rehabilitation project: Phillip Island, Pacific Ocean"

Dr Peter Coyne

Date: Wednesday, 1 October 2014

Perched atop a submerged seamount, in turn atop a submarine ridge, Phillip Island and its close neighbour Norfolk Island are tiny specks, the only land in a vast expanse (2.5 million square kilometres) of the southwest Pacific Ocean. Both islands were created by volcanic activity between 2.8 and 2.2 million years ago. The plateau top of the seamount, 100 × 35 kilometres, is between 30 and 75 metres below present sea level. Sequential ice ages during the last 2 million years exposed the entire plateau, an area about 100 times the size of the present islands. Such an area could have accommodated about four times as many species as the present islands. During the last ice age the entire plateau was exposed for 24,000 years, until 13,000 years ago. A sea level 25 metres below today's, reached 10,000 years ago, still exposed an island about 35 km long, large enough to accommodate more than double the species count of the present islands and joining them with dry land. An island at least this large was exposed for 60,000 years during the last ice age, before the sea reached its present level just 6,000 years ago. The generally much larger size of the islands in the past, and the ecological stress caused by their shrinking area between 13,000 and 6,000 years ago, with the consequent loss of three-quarters of their species, could help explain the great biological value of the islands, and of Phillip Island specifically.

Phillip Island was densely vegetated when pigs were released there in 1793, followed by goats and rabbits by 1830. The feral grazers quickly destroyed the vegetation and by 1860 the island was mostly bare. Photographs dated 1906, when only rabbits remained, show landscapes almost identical to those of 1980 — almost no vegetation was present. In 1979 Dr Coyne began a three-year experimental program to investigate the effects of the rabbits and the potential for vegetation re-establishment. The work was physically difficult and often hazardous. The first year's results were enough to persuade decision-makers that the rabbits should be eradicated. That work began in 1981 and by 1986 the rabbits had been destroyed by a combination of an artificial strain of myxoma virus, poisoning, shooting, trapping and fumigating. The eradication program required swimming to habitat accessible only from the sea, archery to distribute the myxoma vector (rabbit fleas) to other inaccessible areas of habitat, and a lot of rock climbing on cliffs up to 250 metres high. Since then the island has been transformed by new vegetation, most arising spontaneously. Some of the world's rarest plant species have been discovered, rediscovered or have increased in numbers. One has only a single genotype, two have fewer than fifty individuals and another has fewer than 250 individuals. A genus and species endemic to Phillip Island sadly was not rediscovered, and at least two Phillip Island plants are extinct. Fauna have also benefitted from the revegetation, and being free of rats and cats the island has potential as a refuge for threatened fauna endemic to Norfolk Island.


1227th Ordinary General Meeting

"A drop of optics"

Dr Steve Lee and Dr Tri Phan, joint winners of the 2014 ANSTO Eureka Prize for Innovative Use of Technology

Date: Wednesday 5 November 2014

The talk at the 1227th OGM was presented by Dr Steve Lee and Dr Tri Phan, joint winners of the 2014 ANSTO Eureka Prize for Innovative Use of Technology. They received the award for developing a very inexpensive polymer lens with extraordinarily high resolution that can be used on cameras like those found on mobile telephones.

In recent years, miniaturisation has revolutionised sensors: small image sensors mean that the optical device can also be miniaturised, and it is much easier to get good optical qualities in a small lens than a large one. The early miniaturised lenses were ground from small pieces of glass and were quite expensive to manufacture. However, with the development of polymers with good optical qualities, high-quality lenses can now be moulded rather than ground. Better still, if surface tension rather than a mould is allowed to create the lens surface, surface roughness (which is almost impossible to avoid with any moulding process) can be largely eliminated.

A familiar example of a surface-tension lens is the optical quality of a raindrop. If the liquid used to form the lens is of much greater viscosity than water (for example, a viscoelastic polymer), gravity can be used to shape the surface of the lens to give specific optical properties. The technique that Dr Lee and Dr Phan developed was to use highly viscous polydimethylsiloxane (also referred to as silicone) as the polymer, and to suspend droplets from a small orifice so that gravity forms droplets with a curvature that has the right optical characteristics. The silicone polymer can be cross-linked, and its shape thereby stabilised, simply by putting it into an oven to cure. Different lens geometries can be obtained by applying several layers of polymer with intermediate curing steps.

One application that Dr Lee and Dr Phan have developed is to clip these lenses onto the cameras of mobile telephones. A standard (moulded) mobile-telephone lens has a surface roughness of around 200 nm, whereas the elastomer lens is around 10 nm. Consequently, much finer detail can be resolved using the polymer lens. The opportunity is to integrate lenses such as these into smartphones and use them for diagnostic and remote-sensing applications.

The technology was just released at the Google "The Mobile First World" conference in Taiwan.


1189th General Meeting

Lecture delivered for the Two Societies Meeting:
"Searching for nanosecond laser pulses from outer space"

Dr Ragbir Bhathal, University of Western Sydney

Tuesday 22 March 2011 at 6 pm

School of Physics, University of Sydney

Meeting report by Dr Frederick Osman 

On Tuesday 22 March 2011, the Australian Institute of Physics and the Royal Society of New South Wales held their annual Two Societies meeting at the University of Sydney and featured Dr Ragbir Bhathal on his topic of searching for very fast light flashes.

Dr Bhathal opened his talk by saying that we should be searching for nanosecond laser pulses from ETI. He believes that ETI would have surpassed the microwave threshold and gone on to use laser pulses for intergalactic communications. A nanosecond laser pulse has several advantages, he said. Apart from its directivity, a nanosecond laser pulse of 10¹⁵ W or more would outshine its star by four to seven orders of magnitude. Such a pulse could thus be easily detected by present-day optical telescopes equipped with fast-response photomultiplier tubes (PMTs) or avalanche photodiodes (APDs). Because the telescopes are being used as photon buckets, they need not be highly sophisticated. The fact that the National Ignition Facility in the US has been able to generate 10¹⁵ W laser pulses, albeit only for a few nanoseconds, lends credibility to the use of lasers as communication devices by ETI civilisations. The optical search strategy has been used in a dedicated mode only for the last ten years. Four groups, three in the United States (Harvard University/Princeton University, the University of California and the SETI Institute) and one in Australia (the OZ OSETI Project at the University of Western Sydney), have led the charge for the optical search strategy.

Dr Bhathal's optical search is the longest dedicated optical search in the Southern Hemisphere. Last year a group of Japanese scientists and engineers also joined the optical and microwave searches. However, to date no positive signals in the optical spectrum have been received. A laser-lookalike signal was detected in 2008 by Dr Bhathal, emanating from the globular cluster 47 Tucanae, but it was dismissed after a six-month search of the same region failed to detect the signal again. Considered the father of SETI in Australia, Dr Bhathal hopes to continue the optical search with a new dedicated one-metre telescope, which is on the drawing board at the moment. Dr Bhathal also discussed the latest developments in the microwave search strategy, which clocked up 50 years last year, and other programs underway in the search for life in the universe, such as the search for glycine, the search for Earth-like planets by extrasolar-planet scientists and the Kepler mission, the Mars explorations, and meteorites.

Dr Bhathal ended his lecture by quoting the great mathematician and physicist Carl Friedrich Gauss, who said that the detection of a signal from ETI "would be greater than the discovery of America". The Australian Institute of Physics and the Royal Society of New South Wales thank Dr Ragbir Bhathal for his outstanding lecture!


1188th General Meeting

Lecture delivered for the Four Societies Meeting:
"Geothermal energy - current state of play and developments"

Dr Stuart McDonnell, Geodynamics Ltd

Mr Stephen de Belle, Granite Power Ltd

Thursday 24 February 2011 at 6 pm

Hamilton Room, Trade & Investment Centre, Industry & Investment NSW, Level 47, MLC Centre, 19 Martin Place, Sydney

Meeting report by Donald Hector 

The annual meeting of the "Four Societies" - the Royal Society of NSW, the Australian Nuclear Association, the Nuclear Engineering Panel of Engineers Australia (Sydney Division), and the Australian Institute of Energy - heard two perspectives on geothermal energy as a major energy source for the generation of electric power. The speakers were Dr Stuart McDonnell, chief operating officer of Geodynamics Ltd, and Stephen de Belle, managing director of Granite Power Ltd. Both companies are developing 'hot rocks' technology for the generation of electricity. The concept behind this technology is straightforward enough: a source of hot rock, typically granite at temperatures of 150-300 °C at depths between 1500 m and 5000 m below the Earth's surface, is identified. The rock is fractured and water is pumped under high pressure from the surface down through the hot rock, where it is heated to very high temperatures. When the water returns to the surface, the energy is used to drive turbines which in turn generate electricity. The thermal resources in Australia are huge: in the Cooper Basin alone, there are hot-rock deposits capable of generating as much electricity as burning 750 million tonnes of coal or 16 trillion cubic feet of natural gas.
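The two energy figures quoted can be cross-checked with some rough arithmetic. The heating values below are typical textbook numbers assumed for the illustration, not figures from the talk:

```python
# Rough energy-equivalence check (assumed typical heating values):
coal_energy = 750e6 * 24e9    # 750 Mt of black coal at ~24 GJ/tonne, in joules
gas_energy = 16e12 * 1.055e6  # 16 trillion cubic feet of gas at ~1.055 MJ/cf, in joules

ratio = coal_energy / gas_energy  # close to 1: the two figures are roughly equivalent
```

Both work out to roughly 2 × 10¹⁹ J, so the two comparisons quoted in the talk are indeed describing the same order of thermal resource.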

Of course, the devil is in the detail. It is technologically challenging and expensive to drill to these depths. In addition, there are other significant technological challenges that need to be resolved such as fluid chemistry when the hot water reacts with the minerals in the rock and the components in the system, the ability to manage multiple fracture-zones in order to extract the maximum amount of heat, the gradual reduction of the temperature of the resource over time, and the challenges in creating viable, efficient heat exchanger designs in rock several kilometres beneath the Earth's surface. 

If these technological and economic issues can be overcome, geothermal generation is well placed to provide a substantial proportion of Australia's baseload electricity demand. This could be as high as 2,300 MW of base-load capacity by 2020. Some government and private funding has already been committed with the intention to seek further capital from institutional investors during 2011.


The Royal Society of NSW Forum 2011

"Belief and science: the belief/knowledge dilemma"

Barry Jones and David Malouf

Wednesday 6 April 2011 at 6 pm

The Darlington Centre, University of Sydney

Barry Jones
Have scientists become polarised into believers and non-believers? Barry Jones posed this question to David Malouf and members of the Society at the 1190th General Meeting on Wednesday, 6 April 2011. Reflecting upon this, Barry referred to the scientific paradigm that has emerged over the last several hundred years: scientists gather information in order to try to make sense of observed phenomena using rational analysis. Science has evolved to become not so much a matter of belief but rather of acceptance of the most sensible explanation based on the accumulation of evidence. Nonetheless, when major paradigm shifts in scientific thinking take place, there are often eminent experts who disagree and refuse to accept the new theory. This slows down the acceptance of a new paradigm but ultimately, in most cases, rational thought prevails.
David Malouf

David Malouf pointed out that non-scientists have to rely on what they are told in order to evaluate scientific theories. He noted the significant shift since the 18th century, when early scientists put their theories to learned academies (such as the Royal Society, London) for expert examination, and these academies determined what was accepted as scientific knowledge and what was rejected. Today, however, with the highly complex issues that society faces, there are significant public policy implications that need to be resolved based on expert advice. But what do we do when the experts disagree? We are largely dependent on the media to inform us. This is further complicated because important issues are usually not just scientific in their nature but often have economic and social imperatives that commercial groups, governments and other interests seek to manipulate. Barry commented that the sheer complexity of science has forced scientists into increasing specialisation. Furthermore, scientists are heavily reliant on research grants from government and private enterprise and this has discouraged them from entering into controversies. This is quite different to the era of only 50 or 70 years ago, when renowned scientists were not afraid to comment outside their area of specific expertise.

In their final comments, Barry emphasised that the task of a scientist is to analyse inconceivably complex data and make sense of it, but the public policy imperatives are driven by media outcomes, which necessarily requires the debate to be simplistic. David is fascinated by the rate of change of technology and, almost unexpectedly, has come to the realisation that the more we know about the complexities of nature, of the human body, the weather and so on, the more questions are exposed. Science has been enormously successful and exciting in bringing understanding to a world that we know so little about.

Annual Dinner and Awards 2011

The Society held its Annual Dinner for 2011 at St Paul's College, University of Sydney on Friday 18 February 2011. Our guest-of-honour was the Governor of NSW, Her Excellency Professor Marie Bashir AC CVO, one of our two Patrons and a long-standing supporter of the Society. We were also pleased to have three Deans of Science from universities in Sydney present. In her Occasional Address Her Excellency made reference to the antecedents of the Society and the work of one of her predecessors, Governor Lachlan Macquarie, in creating a climate in which Societies such as ours might germinate. We appreciate her support and that of the unbroken line of her predecessors.
The Governor, Marie Bashir, presents Fellowship to Professor Michelle Simmons.
The Governor, Marie Bashir, presents Fellowship to Emeritus Scientia Professor Eugenie Lumbers.
The Governor, Marie Bashir, congratulates Dr Ken Campbell on his award of the Clarke Medal.
The Governor, Marie Bashir, presents Assoc. Prof. Angela Moles with the Edgeworth David Medal.
The Governor, Marie Bashir, presents Prof. Rick Shine with the Walter Burfitt Prize.
The Governor, Marie Bashir, presents Dr Julian King with the joint AIP/Royal Society of NSW Studentship Award.
The Governor, Marie Bashir, with Society President John Hardie after he presented her with a token of the Society's appreciation.
Vice President, Heinrich Hora, gives the vote of thanks.

Soirée at the Nicholson Museum (2006)

Thursday 28 September 2006, 4.30 - 8.30pm

Flushed with success, the Council of the Royal Society decided it was time for a bit of a heels-up - or rather heads-down - to look at treasured items from our wonderful collection. The occasion? A soirée at the Nicholson Museum in The University of Sydney to celebrate the achievements of the previous twelve months.

In that time, the Council managed to win two grants. The first, a State Government grant of $30,000 to be used for the publication of the journal over the next three years, was the work of past president Karina Kelly and her team. The second was $5,500 through one of the Federal Government's Community Heritage Grants organized by the National Library in Canberra.

It was through this grant that our consultant historians, Dr David Branagan and Dr Peter Tyler, assessed the Society's collection of books, journals, maps, drawings, paintings, photographs, lantern-slides and medals to be highly significant both historically and scientifically. Selected items were displayed in the facilities offered by the Nicholson Museum. Staff and guests donned cotton gloves to leaf through some of the precious books, which included works by the American archaeologist and naturalist Charles C. Abbott, and Opuscula, one of the most significant books in the Society's collection, written in Latin by Georg Bauer, better known by his Latin name, Georgius Agricola (1494-1555). He is considered the founder of geology as a discipline. Another book, Cyrillus in Johannem et Leviticum una cum thesauro eiusdem, was published in 1508. It is the only copy held in Australia. There was also the first volume of Curtis' Botanical Magazine, beautifully illustrated by the artist Sydenham S. Edwards and published in 1787. The Society holds the complete set (Volumes 1-14) and many other volumes of the Botanical Magazine by various other publishers.

Also on display were Lawrence Hargrave's aeronautical and other papers, together with some of the drawings and lantern-slides of his first flying machine, and a very rare edition of J.D. Dana's Geology, written when he accompanied W.B. Clarke around the Sydney Basin. Hand-written letters from Society member Charles Darwin and Louis Pasteur were a great attraction. Many of the medals on display were from Clarke's collection. These are kept in the care of the Mitchell Library together with some 20 boxes of books and other items belonging to the Royal Society. These are to be the target of our next round of assessment and listing.

Undeniably the Society is the custodian of a remarkably important collection. It is imperative that we take every step to preserve it and to make it available to all.

Guests at the soirée included the university's Vice-Chancellor, Professor Gavin Brown, and his wife, Diane, the former Vice-Chancellor of Macquarie University, Professor Di Yerbury, acclaimed photographer-astronomer Professor David Malin, Scientia Professor Eugenie Lumbers from the University of New South Wales, the ABC's science broadcaster Robyn Williams, and many others.

The soirée was not without its formalities. The President, Professor Jak Kelly, set the scene, explaining the grounds for the celebration and thanking Professor Brown for so generously arranging premises for the Royal Society. He also thanked Professor Di Yerbury for her help in housing the Society and its collection at Macquarie University for some years. Councillor John Hardie spoke about the work that had been done on the collection through the funding, and outlined some of the recommendations from Peter Tyler's report. These included the urgent need to house the collection suitably so that it could be properly conserved and made available to researchers and the public alike. He also suggested that the Society should initiate a long-term project to "return Science House to Science". He reminded guests that Science House won the first Sulman Medal for Architecture for its architects, Peddle, Thorpe and Walker, in 1932.

Society member and former president, David Branagan, who was also one of the assessors, described some of the "treasures" he had examined and the insight they give to the development of our intellectual and scientific history from Colonial times. Professor Gavin Brown thanked the Royal Society for inviting him and his wife and congratulated the Society on its achievements in recent years.

It was after the formalities that guests were free to peruse the displays to the sounds of restful chamber music from the trio, Sound of Melody, adding to the already splendid atmosphere of the museum itself. David Branagan and members of the Council's grant committee were on hand to assist with enquiries and to point out items of special interest.

The evening finished on a high note when the Society received an unexpected accolade from the Senior Curator of the Nicholson Museum, Michael Turner, who said that he had really enjoyed having so erudite and enthusiastic a group of people at the museum.

Robyn Stutchbury


Annual Dinner and Awards 2009

His Honour Justice James Allsop, President of the NSW Court of Appeal

The Society held a very successful Annual Dinner at the Forum Restaurant, Darlington Centre, at the University of Sydney on 13 March. The Guest-of-Honour was His Honour Justice James Allsop, President of the NSW Court of Appeal, who replaced our Chief Patron, the Governor-General, at relatively short notice. The Society thanks His Honour for his attendance and for his very insightful Occasional Address, which touched on the relationship between the Society and the legal profession.

The other highlight of the evening was the presentation of our Awards for 2008. His Honour presented the Clarke Medal (this year for botany) to Professor Bradley Potts from the University of Tasmania and the Edgeworth David Medal for a young scientist to Dr Adam Micolich of the University of NSW. Associate Professor Bill Sewell read the citations, which were followed by very generous remarks from the recipients in accepting the Awards.

His Honour Justice James Allsop, the President John Hardie and Professor Bradley Potts
His Honour Justice James Allsop, the President John Hardie and Dr Adam Micolich

For further details see the March 2009 Bulletin No. 323.


Annual awards evening and dinner 2014

On Wednesday 7 May, the annual awards evening and annual dinner were held at the Union University and Schools Club in Sydney. The dinner was extremely well attended and the address by Professor Barry Jones AC FAA FACE FAHA FASSA FTSE DistFRSN on the attack on the scientific method stimulated a lot of discussion. During the evening, the Society's 2013 awards were presented and the inaugural group of eleven Fellows were presented with their certificates.

Back row: Benjamin Eggleton, Jerome Vanclay, Richard Banati, Ian Dawes, John Gascoigne. Front row: Aibing Yu, Ian Sloan, Judith Wheeldon, Donald Hector (President), Heinrich Hora, Merlin Crossley, Trevor Hambley

The President, Dr Donald Hector, presented the Society's 2013 awards. The Edgeworth David Medal was presented to Assoc Prof David Wilson for his outstanding work on modelling HIV/AIDS and using this information to develop treatment and prevention strategies. Prof Michelle Simmons DistFRSN was awarded the Walter Burfitt Medal and Prize, and Professor Brien Holden was awarded the James Cook Medal for his work in treating myopia (a leading cause of preventable blindness), particularly in developing countries. The Clarke Medal could not be presented to distinguished geologist William Griffin, as he was overseas and unable to attend.

Left to right: Assoc Prof David Wilson, President Dr Donald Hector, Prof Brien Holden and Prof Michelle Simmons DistFRSN.

Royal Society Events

The Royal Society of NSW organizes events in Sydney and in its Branches throughout the year. 

In Sydney, these include Ordinary General Meetings (OGMs) held normally at 6.00 for 6.30 pm on the first Wednesday of the month (there is no meeting in January), in the Gallery Room at the State Library of NSW. At the OGMs, society business is conducted, new Fellows and Members are inducted, and reports from Council are given.  This is followed by a public lecture presented by an eminent expert and an optional dinner.  Drinks are served before the meeting.  There is a small charge to attend the meeting and lecture, and to cover refreshments.  The dinner is a separate charge, and must be booked in advance.  All OGMs are open to members of the public.

Since April 2020, during the COVID-19 pandemic, face-to-face meetings have been replaced by virtual meetings, conducted as Zoom webinars, allowing the events program to continue uninterrupted.  It is hoped that face-to-face meetings can be resumed in the latter half of 2021. 

The first OGM of  the year, held in February, has speakers drawn from the winners of the Royal Society Scholarships from the previous year, while the December OGM hears from the winner of the Jak Kelly award, before an informal Christmas party.  The April or May event is our black-tie Annual Dinner and Distinguished Fellow lecture.

Other events are held in collaboration with other groups, including:

  • The Four Societies lecture — with the Australian Institute of Energy, the Nuclear Panel of Engineers Australia (Sydney Division), and the Australian Nuclear Association
  • The Forum — the Australian Academy of Science, with the Australian Academy of Technology and Engineering, the Australian Academy of the Humanities, and the Academy of the Social Sciences in Australia
  • The Dirac lecture — with UNSW Sydney and the Australian Institute of Physics
  • The Liversidge Medal lecture — with the Royal Australian Chemical Institute
