
Royal Society of NSW News & Events


1225th Ordinary General Meeting

"The fourth dimension and beyond - the paradox of working in unimaginable worlds"

Emeritus Scientia Professor Ian Sloan AO FRSN

Date: Wednesday 3 September 2014

Professor Ian Sloan is not content to work in an environment of four dimensions – he is quite at home in spaces with many more dimensions than most of us are accustomed to. Many mathematical problems can be considered as problems in multidimensional space – the question is how we imagine these environments. The dimension of a space can be considered to be the number of directions in which you can go from any single point within it. For example, in our four-dimensional world, from any point we can go in three spatial directions, plus time. If we are in a six-dimensional environment, we can go in six directions from any given point, and mathematically we need no more than six variables to describe this environment. But why would we be interested in multidimensional spaces?

Many problems are best analysed in many dimensions. For example, a shop may carry 250 stock items; its stock levels can be thought of as a single point in 250-dimensional space. Each stock item also has a price, so there are another 250 dimensions to consider. One area where this approach has a major application is in evaluating certain types of financial transactions, such as derivatives.

An investor may want to analyse a potential investment in Woolworths shares, for example. The payoff might be thought of in terms of the closing share price over a period of 250 trading days, in, say, $10 increments. Using multidimensional mathematics, the investor can calculate the expected payoff at a given trading day on the basis of what the closing share price might be. Such calculations soon become extremely complex – in fact, too complex to be evaluated directly (this is known as "the curse of dimensionality", a term coined by Richard Bellman, a noted researcher in the field). And what if the payoff can take any value, rather than moving in $10 increments – does this make it even harder? Fortunately, it does not.
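The "curse of dimensionality" can be made concrete with a simple count: a grid-based method that places n sample points along each axis needs n to the power d evaluations in d dimensions. A minimal sketch (the 250-dimension setting echoes the 250-trading-day example above; the grid sizes are illustrative):

```python
# Illustrative sketch of the "curse of dimensionality": a tensor-product
# grid with n points per axis requires n**d function evaluations in d dims.
def grid_points(points_per_axis: int, dimensions: int) -> int:
    """Total evaluations for a full grid in the given number of dimensions."""
    return points_per_axis ** dimensions

# Even a coarse grid of 10 points per axis is hopeless in 250 dimensions
# (one dimension per trading day in the payoff example).
assert grid_points(10, 2) == 100           # trivial in 2-D
assert grid_points(10, 250) == 10 ** 250   # utterly infeasible
```

This exponential growth is exactly why direct evaluation fails and a statistical approach is needed instead.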

Using a statistical approach known as the Monte Carlo method, these highly complex functions can be evaluated quite accurately. The Monte Carlo method may be thought of as a technique whereby one randomly throws points at a target and evaluates whether each point falls on the target or outside it. If the target can be described mathematically, the function can be evaluated with considerable accuracy after no more than a few thousand random "throws", which makes even highly complex derivative functions tractable. The problem then lies in the assumptions underlying the mathematical model used to define the "target" – flaws in those assumptions have cost many a fortune!

Despite the accuracy of the Monte Carlo method, for highly complex functions the number of random throws can become very large, and the question arises: can we do better than generating the throws randomly? For some problems, using a structured pattern of points – for example, a lattice – has been shown to converge much more quickly.
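The "throwing points at a target" idea above can be sketched with the classic example of estimating π: random points are thrown into a unit square, and the fraction landing inside the quarter circle approximates π/4. This is a minimal illustration of the method, not the derivative-pricing calculation from the talk (the seed and throw counts are arbitrary choices):

```python
import random

def estimate_pi(n_throws: int, seed: int = 42) -> float:
    """Monte Carlo 'dartboard': the fraction of random points in the unit
    square that land inside the quarter circle approximates pi/4."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_throws):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:  # did the "throw" land on the target?
            hits += 1
    return 4.0 * hits / n_throws

# A few thousand "throws" already gives a respectable estimate of pi.
print(estimate_pi(10_000))  # close to 3.14 (the error shrinks as 1/sqrt(n))
```

The error of such an estimate shrinks only as 1/√n regardless of dimension, which is why Monte Carlo escapes the curse of dimensionality – and why replacing the random throws with carefully constructed lattice points can, for suitable problems, converge faster still.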

Multidimensional mathematics is one of our most powerful tools for solving problems, from financial derivatives to metadata analysis to cosmology. Professor Sloan provided a particularly clear insight into a highly complex and very powerful mathematical technique.

1224th Ordinary General Meeting

"Saving Australia through science education"

Emeritus Scientia Professor Eugenie Lumbers AM DistFRSN

Wednesday, 6 August 2014

The world is experiencing an exponential rate of technological progress. Change was relatively gradual from the time of the domestication of the horse until the 17th century. Indeed, in the early stages of the Industrial Revolution, industry was still heavily dependent on horse-drawn transport. In 1900, just 14 years after the invention of the motor car, there were still 300,000 horses in service in London. That same year, there were 0.11 cars per thousand people in the US; by 2009 there were 828. This enormous, rapidly accelerating technological change took place as a consequence of science and its application in the development of technology. The question is why there was such enthusiasm for science in the 1940s and 1950s, yet it has largely disappeared today in many countries, not the least of which is Australia. This poses a major challenge – how will Australia keep up with technological progress when few people are interested in pursuing a science or technology education? Despite the apparent interest in science – in a multitude of TV programmes, for example – such programmes position science as entertainment, not as true science.

Despite this rapid shift away from science, Australia was still doing well by international standards until the late 20th century. In 2000, Australia ranked third in the world (after South Korea and Japan) in the OECD Programme for International Student Assessment (PISA), a test that measures problem-solving capability in 15-year-olds. In the latest test, in 2012, Australia ranked eighth (after Singapore, South Korea, Japan, China, Hong Kong, Taipei and Canada). It is not surprising that Australia's ranking is slipping when only 51% of students take a science subject in year 12 and fewer than 20% study chemistry or physics. (Interestingly, biology is somewhat higher at 25%, because it is seen as "less academic".) What will the future hold when the technologically-educated people of today are gone? It is extraordinary that 76% of Australians do not see science as directly relevant to themselves, yet consider it important to Australia's future.

The Australian Academy of Science has tried to address this through its "Primary Connections" programme, an inquiry-based programme to help teachers develop their teaching programmes and to provide curriculum resources. Similarly, the Academy's "Science by Doing" programme for secondary schools is aimed at stimulating the all-important interest and enjoyment in science among children in early secondary school, so that they go on to choose a career in science.

1223rd Ordinary General Meeting

"What causes MS? The impact of the genetic revolution"

Professor Graeme Stewart AM

Wednesday, 2 July 2014

Professor Graeme Stewart AM, director of clinical immunology at Westmead Hospital, has researched the genetic influences on disease, in particular on multiple sclerosis (MS). MS is the commonest chronic neurological disorder of young adults. It usually starts with a relapsing/remitting phase (symptoms occur and then go into remission for extended periods), commonly with onset at about the age of 30. The disease can be relatively benign with periods of disability, it can present as a relapsing/remitting disease with a gradual increase in disability, or in about 10–20% of patients it can be "primary progressive", where disability increases steadily over time. MS is caused by the body's immune system malfunctioning – macrophages devour the myelin sheath around nerve cells, exposing the nerve axon and thereby disrupting the flow of information along the nerve cell. The body is able to repair the damage by re-myelinating the nerve cells after this initial attack; however, if the myelin is attacked a second time in the same place, the body is unable to repair the sheath and a relapse occurs. Hence the symptoms of the disease progress.

The important question is: what causes this? MS is a disease that is clearly influenced by both genes and environment. Studies of the disease in identical twins show 30% concordance, whereas in fraternal twins there is 4% concordance, against a background incidence of MS of 0.4% of the population. This suggests that genetic influences are very significant but that environmental factors are also a consideration. A striking environmental effect is that the incidence of MS is quite highly correlated with latitude – in Australia, for example, there is a 4- to 7-fold hazard ratio between North Queensland and Tasmania. In the southern hemisphere, the further south you live, the more likely you are to contract MS. The most likely reason for this is reduced exposure to UV-B light, and the consequent vitamin D deficiency, the further you are from the equator. Other environmental factors include smoking and exposure to the Epstein-Barr virus that causes glandular fever (almost all MS patients have been infected with Epstein-Barr virus). The fact that Epstein-Barr virus is implicated in virtually all MS cases may present an opportunity for treatment if the effect of this virus on DNA is understood.

Since the early 1970s, there has been a search for the genes implicated in MS. The first was found in 1972, but it was not until 2007 that the second gene was identified. Since then, as a consequence of the Human Genome Project and widespread sequencing technology, together with recent advances in computer power and in statistical algorithms for handling large amounts of data, over 100 genes have been identified. Pursuing genetic associations is expected to give insight into the pathogenesis of the disease, in particular the interaction between genes and environment. It is hoped that this will lead to interventions to prevent the disease from progressing. In addition, identifying genetic biomarkers may provide major opportunities for new treatments, including personalised treatments based on the individual's genetic profile.

There has been substantial progress in treatments for MS, including trials of drugs to stop T cells crossing the blood-brain barrier, drugs that capture lymphocytes and hold them in the lymph nodes and early indications that drugs targeting specific proteins identified through genetic analysis might be useful. In addition, trials are underway to see whether large doses of vitamin D might have some impact and whether increased exposure to ultraviolet light might also offer some improvement.

1222nd Ordinary General Meeting

Wednesday, 4 June 2014

"What lessons have we learnt from the Global Financial Crisis?"

Professor Robert Marks

In 2008, the world suffered "the equivalent of cardiac arrest", according to the Financial Times. It became virtually impossible for any institution to finance itself (that is, to borrow in the markets) for longer than overnight. With the collapse of Lehman Bros, interbank credit markets froze and counterparty risk was considered too great for prospective lenders to take on transactions. The London interbank overnight lending rate, typically in the range of 0.2% to 0.8%, spiked to over 3%. This situation raises two questions: what caused the global financial crisis (GFC)? And how can we attempt to avoid similar crises in the future? The origins of the crisis go back more than 30 years.

Starting in 1977, there were substantial changes made to US investment legislation. Early in this period, the aim was to make finance more readily available to low-income borrowers, to progressively eliminate controls on mortgage rates and to remove discrimination in the US housing market. In 1999 and 2000, there was substantial deregulation, with major changes to long-standing legislation – in particular the repeal of the Glass-Steagall Act of 1933, which had imposed restrictions on banks during the Great Depression. There were also reforms to the federal housing finance regulatory agencies, loosening their lending requirements.

This period of financial deregulation encouraged consolidation and demutualisation of many financial institutions that had been mutually or privately owned, with these being floated as public companies. Whereas previously their lending practices had been conservative because they had been risking their own money, now the money at risk belonged to other people! There was also great creativity in developing new financial products and instruments: Mortgage-Backed Securities (MBS), structured investment vehicles, Credit Default Swaps (CDS) and Collateralised Mortgage Obligations (CMOs).

In the early 2000s, the September 11 attacks, coming not long after the bursting of the "tech bubble", led to a prolonged period of low interest rates. US fiscal policy was heavily in deficit, leading to massive issuance of US bonds that were largely bought by China and other Asian countries. At the same time there was further financial deregulation, relaxing capital requirements and thereby encouraging higher gearing in financial institutions.

Unsurprisingly, firms responded to the incentives put before them. The market for the new financial instruments boomed, and rating agencies responded by changing the way in which they charged for their services – they began charging the firms whose products they were rating, rather than the potential buyers of those products. In the US, the financial sector grew from 3.5% of GDP in 1960 to nearly 8% of GDP in 2008.

Drawing these strands together, there were four causes of the GFC: the repeal of the Glass-Steagall Act; the decision by Congress not to regulate derivatives; the relaxation of regulations that allowed banks to expand their gearing; and the change by the ratings agencies to charging the issuer rather than the buyer of rated products.

How likely is this type of situation to occur again in the near future?

Unfortunately, a number of European countries may be facing similar challenges unless they take steps to avoid the problems that the US experienced. Fortunately, Australia avoided the worst of the GFC, well-served by the "four pillars" banking policy. However, there needs to be recognition that information is asymmetric and that the issue is really not one of risk but rather of uncertainty, where there are no simple answers. As George Santayana observed in 1905, "those who cannot remember the past are condemned to repeat it".

Annual awards evening and dinner 2014

On Wednesday 7 May, the annual awards evening and annual dinner was held at the Union University and Schools Club in Sydney. The dinner was extremely well attended and the address by Professor Barry Jones AC FAA FACE FAHA FASSA FTSE DistFRSN on the attack on the scientific method stimulated a lot of discussion. During the evening, the Society's 2013 awards were presented and the inaugural group of eleven Fellows were presented with their certificates.

Back row: Benjamin Eggleton, Jerome Vanclay, Richard Banati, Ian Dawes, John Gascoigne. Front row: Aibing Yu, Ian Sloan, Judith Wheeldon, Donald Hector (President), Heinrich Hora, Merlin Crossley, Trevor Hambley

The President, Dr Donald Hector, presented the Society's 2013 awards. The Edgeworth David Medal was presented to Assoc Prof David Wilson, for his outstanding work on modelling HIV/AIDS and using this information to develop treatment and prevention strategies. Prof Michelle Simmons DistFRSN was awarded the Walter Burfitt Medal and Prize and Professor Brien Holden was awarded the James Cook Medal for his work in treating myopia (a leading cause of preventable blindness), particularly in developing world countries. The Clarke Medal could not be presented to distinguished geologist William Griffin, as he was overseas and unable to attend.

Left to right: Assoc Prof David Wilson, President Dr Donald Hector, Prof Brien Holden and Prof Michelle Simmons DistFRSN.

Distinguished Fellow's Lecture 2014

The Society was proud to have Professor Barry Jones AC DistFRSN present the second annual Distinguished Fellow's Lecture at the Society's annual dinner on Wednesday 7 May 2014. Professor Jones is the only person to be a Fellow of all four of Australia's learned Academies.

Prof Barry Jones AC DistFRSN delivers the second Royal Society of NSW Distinguished Fellow's Lecture.

1220th Ordinary General Meeting

Wednesday, 2 April 2014

"The Jameson cell"

Laureate Professor Graeme Jameson AO

At the 1220th ordinary general meeting of the Society, Laureate Professor Graeme Jameson described the development of the Jameson cell, one of the most important technological contributions to the Australian economy in the last 50 years.

The Jameson cell is a flotation cell used to concentrate the valuable components of ore in minerals processing. In a typical mining operation, the first two stages of extracting minerals are the mine itself, from which the ore is recovered, and the concentrator, where the valuable mineral is separated from the rest. Generally, the valuable components are no more than 2% of the ore recovered, so there is a massive challenge in isolating them from the spoil for further processing. An important technology developed to achieve this concentration step is the flotation cell, a process first developed early in the 20th century.

In flotation, the ore is ground into very fine particles and dispersed with water and surfactants in a large mixing vessel that can be kept agitated and into the bottom of which compressed air can be introduced. Reagents are added to make the valuable mineral particles exposed during crushing hydrophobic. Air is bubbled through the suspension; the hydrophobic mineral particles attach to the bubbles, float to the surface as a froth and are then skimmed off for further processing and enrichment. Because large volumes of ore have to be treated to recover a relatively small proportion of valuable product, this is a very expensive step in recovering minerals: first, the ore has to be ground to very fine particle sizes (typically around 150 micrometres), which takes a lot of energy; and second, the volume that has to be treated in preparing the slurry is large, so processing equipment is big and expensive. Any technology that reduces either the cost of grinding or the size of the processing equipment can have a major impact on the cost of production. The Jameson cell revolutionised the flotation process by reducing the size of the equipment needed to efficiently float off the minerals.

Over a period of several years, Professor Jameson identified the optimum parameters for particle size and the corresponding optimum size for the air bubbles used to float the treated particles. Generally, particle size needs to be less than 150 micrometres or, even better, less than 100 micrometres: the smaller the particle, the more likely it is to consist of the pure mineral. But the real technological breakthrough was identifying that the optimum bubble size is about 300 micrometres. Until then, conventional cells had operated with bubbles about three times that size, at about 1 mm diameter. Having identified the optimum bubble size, the challenge was then to design equipment that produced the right amount of shear to generate bubbles of 300 micrometres diameter. This turned out to be relatively simple, using high-pressure jets of water to entrain the air.

Much of the commercialisation work was done at Mount Isa in the 1980s and 1990s. Since then, the cell has been deployed around the world and is used routinely to extract coal, copper, lead, zinc and potash, as well as in other industries such as oil-sands extraction and industrial waste treatment. Over 300 cells have been installed, and the cumulative value created by this invention is more than $25 billion.

Professor Jameson was named NSW Scientist of the Year in 2013.

1219th Ordinary General Meeting

Wednesday, 5 March 2014

"Big data knowledge discovery: machine learning meets natural science"

Professor Hugh Durrant-Whyte FRS, CEO, National ICT Australia

Hugh Durrant-Whyte is an internationally-recognised expert on the analysis of "big data" – the mass of information that is being generated by current information and communication technologies. Much of this is "metadata" – data captured as part of some activity (for example, the camera settings and capture date recorded when a digital photograph is taken, or the data kept by telecommunication companies every time a mobile phone call is made).

Some 2.5×10¹⁸ bytes of data are generated every day – there is immense value in mining this data, but doing so requires sophisticated analytical techniques. "Data analytics" is the term coined for technologies to analyse this data in areas as varied as the finance industry, the health industry, infrastructure planning, failure analysis in mechanical and electronic equipment and environmental analysis, to name but a few. Data analytics utilises Bayesian probability theory (named after the Rev Thomas Bayes, an 18th-century mathematician) to prepare quantitative models of existing data, gather new data to address remaining problems, and then update the model to incorporate both the old and new data.
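The Bayesian loop described here – model the existing data, gather new data, update the model – can be sketched in its simplest form, a Beta-Bernoulli model for an unknown event rate. The prior and the observation counts below are invented purely for illustration:

```python
def beta_update(alpha: float, beta: float, successes: int, failures: int):
    """Bayesian update for a Bernoulli rate with a Beta(alpha, beta) prior:
    the posterior is again a Beta distribution, with the observed counts
    simply added to the prior's parameters."""
    return alpha + successes, beta + failures

# Start from a uniform prior Beta(1, 1), then fold in two batches of data -
# exactly the 'update the model with old and new data' loop in the text.
a, b = beta_update(1.0, 1.0, successes=7, failures=3)    # first batch
a, b = beta_update(a, b, successes=14, failures=6)       # second batch
posterior_mean = a / (a + b)
print(a, b, posterior_mean)  # 22.0 10.0 0.6875
```

Real data-analytics systems apply this same prior-to-posterior principle to far richer models (spatial fields, failure times, behaviour models), but the conjugate-update mechanics shown here are the core idea.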

Data analytics can be modelled using three types of mathematical functions: discrete functions that describe, for example, events or people's actions; finite probability functions, such as signals or locations; and infinite probability functions, such as spatial or temporal fields. As the masses of data available increase, the analysis can converge on knowledge. For example, payment patterns exhibited by individuals can be aggregated into the behaviours of bank branch customers, giving an understanding of consumer behaviour. On the other side of the table, customers can utilise masses of data to take advantage of the best deals available or to customise internet-based content that they may wish to buy.

Where masses of historical data are available (for example, managing water assets) readily available historical parameters can be analysed for such applications as predicting equipment failures. In the case of water asset management, pipe age, soil type etc can be analysed to give a probabilistic analysis of when a water main might fail.

The mining industry has invested large amounts of money in developing systems to utilise masses of existing information to automate mine operation. This can take all available data around the surface of the mine, the subsurface, mapping, drilling, to create a completely integrated data model into a single, real-time representation of the mine.

The purpose of National ICT Australia (NICTA) is to utilise these data analytics approaches to produce leading-edge technologies and models for such varied applications as financial modelling and creating large-scale, fully integrated data maps of regions (perhaps even as large as continental Australia). There is also a particular focus on data-driven discovery in the natural sciences, in applications as varied as modelling ancient plate tectonics to predict mineralisation (on a timeframe of as much as 1.5 billion years) or ecological modelling, for example predicting the growth of trees. Ultimately, these may be able to be integrated into one massive model of the Australian continent.

Four Societies Lecture 2014

Thursday, 27 February 2014

"Questions of power in NSW"

Professor Mary O'Kane, NSW Chief Scientist and Engineer

At the annual Four Societies Lecture, Professor Mary O'Kane considered the major questions that face NSW in the future of energy production and utilisation. Asking the right questions is key – it reduces the time taken to identify the best solutions.

Australia is the ninth largest energy producer in the world and one of only four net energy exporters. We have 38% of the world's uranium, 9% of the world's coal and 2% of the world's gas. In terms of consumption, agriculture takes 3%, mining 13.5%, manufacturing and construction 25%, transportation 38% and residential about 11%. The 2014 Commonwealth Energy White Paper is seeking to address a number of questions regarding Australia's energy future. These include security of energy sources, the role of government and regulatory implications, growth and investment, trade and international relations, productivity and efficiency, and alternative and emerging energy sources and technologies.

A recent report by the Grattan Institute identified a number of important issues. Australia has a lot of gas and coal, but has yet to fully consider the impact of having no clear climate change policy. There is also the question of how the electrical system (particularly one based on large generation units interconnected by a grid) can meet the challenge of occasional very high peak demand. The Grattan Institute also posed questions around the balance of market and regulation, and the importance of getting this right, and explored the implications of new technologies and whether these provide potential solutions.

Australia is not unique in facing these challenges. One approach taken in the US has been to establish an energy agency using a model originally conceived for advanced research projects in the defence industry. ARPA-E, the Advanced Research Projects Agency-Energy, was established to fund high-risk/high-reward research to identify new energy technologies for the US. The research programmes in its portfolio relate to reconceiving the grid, the impact of microgrids, the impact of analysing big data, the gas revolution, new ways to achieve higher efficiencies, entirely new technologies, the best policy settings to encourage the adoption of new technologies, and innovative models for research and development. Perhaps these sorts of approaches need to be utilised in NSW.

Questions that need to be addressed include: what about nuclear energy? To what extent is geothermal energy applicable? How should we gain new efficiencies? How can we better optimise grid storage and geometry? What are the downsides of these various technologies? Are there opportunities to export directly to our immediate neighbours (e.g. Indonesia)? How effective is Australia's energy R&D?

Professor O'Kane summarised the issues in three searching questions. First, how do we characterise the system that we want and the process to realise it? (What are the most important characteristics that our energy future must have, and which would be nice to have? What energy futures do we definitely not want?) Second, who should be responsible for demonstrating new technologies (responsible for progress, experiment, scale-up, the economic model and "energy equity")? And third, how can we have the best system possible? We must become expert at asking the questions, seeking solutions from around the world and, importantly, developing solutions locally where appropriate, in order to create a leadership position.

Joint meeting with the Australian Academy of Forensic Sciences

Thursday, 19 February 2014

"Searching for clues: Unmasking art fraud and fraudsters" - Associate Professor Robyn Sloggett

At the first joint meeting of the Society and the Australian Academy of Forensic Sciences, Associate Professor Robyn Sloggett explained the approach taken by forensic scientists in investigating and prosecuting cultural heritage offences. The difficulty facing authorities is determining whether or not cultural records are true and verifiable. Forensic examination in these situations follows the Locard principle (named after Edmond Locard, the pioneering French forensic scientist) that "every contact leaves a trace".

In order to determine the provenance of a work of art, the forensic scientist seeks to establish how the object was made, what it is made of, when it was made and where it was made – but does not attempt to determine who made it. An important foundation of provenance is establishing a body of work that can be used as a reference. Here, art and science converge. For example, in determining whether or not a painting might have been painted by Rembrandt, it is known that Rembrandt lived from 1606 to 1669, that certain types of pigments, but not others, were available in that era, that the canvas and other materials need to be consistent, and so on. But none of these, although necessary, can on its own be sufficient to establish authenticity. As the philosopher Karl Popper put it, testing cannot establish authenticity; rather, it can only falsify it.

When forensic science is used to determine whether fraud has taken place, a number of questions have to be answered: was there a financial benefit? Was there deliberate deception? Was a non-authentic work passed off as authentic? Professor Sloggett referred to a notable case in which the Australian painter and art copyist William Blundell painted works in the style of a number of well-known Australian artists. He referred to these works as "innuendos", stating that they were intended for decorative purposes only and were not passed off as originals, typically being sold for only a few hundred dollars; thus, there was no intention to defraud.

When authentication is required, the forensic approach is a combination of scientific analysis, gathering historical facts, attempting to verify the provenance, and weighing up the evidence for and against the item being authentic; these are considered together in order to reach a determination, on the balance of probabilities, as to whether or not the item is indeed authentic. This depends on the availability of good databases and a logical development of the case, with corroborative evidence and expert knowledge and opinion. The forensic process can be considered in two parts: investigation of primary sources, and of secondary sources. Primary-source techniques are either invasive or non-invasive, with invasive techniques including laboratory-based analysis of materials. Non-invasive methods such as spectroscopy, X-ray diffraction, electron microscopy, and Raman and Fourier-transform infrared analysis have become important tools in recent decades.

Typically, the first steps are to examine documents regarding the artefact and then to investigate materials such as the frame, paints, brushstrokes and finishing techniques. Contaminants such as pollen, dirt and fingerprints can also be useful, as are ageing characteristics and the effects of the environment. Later changes can also be important. Secondary sources are then investigated; these include observations of style and technique, but they are more difficult to deal with as they are subjective and expert opinion is often divided.

Professor Sloggett gave some insight into a number of notorious fraud cases, one being that of Wolfgang Beltracchi, who forged over 200 works that were passed off as pre-World War II works by famous European artists such as Max Ernst, Heinrich Campendonk and Fernand Léger. The amount involved was $48 million over 15 years, and the fraud resulted in a gaol term of six years, considered to be a rather light sentence. In Europe, art fraud cases are relatively common, but in Australia they are quite rare. One reason for this is that there is no art fraud squad in Australia: criminal prosecutions are rare because there is no professional expertise in identifying and tracking down criminal art fraud cases and taking them to prosecution.

There was an interesting discussion after the talk that continued over the first joint dinner enjoyed by both the Society and the Academy members.

1218th Ordinary General Meeting

Presentations by Royal Society of NSW scholarship winners 2014

Date: Wednesday 5 February 2014

Venue: Union University and Schools Club, 25 Bent St, Sydney

John Chan (Pharmacology, University of Sydney)
Jessica Stanley (Chemistry, University of Sydney)
Jiangbo (Tim) Zhao (Advanced Cytometry, Macquarie University)

This presentation was delivered by A/Prof. Judith Dawes.

2014 Sydney Lecture Series

Meetings are held at various venues in Sydney (be sure to check the web-site a few days before the event for final venue details). Unless indicated, booking is not necessary. All welcome. Meetings usually commence at 6:00 pm for 6:30pm.

Entry is $5 for members of the Society and $20 for non-members to cover venue hire and a welcome drink. We often have dinner after the meeting (the cost is $75 per head). Pre-booking is appreciated.

1217th OGM and Christmas party

“Probing the nano-world with the symmetries of light”

Xavier Zambrana-Puyalto
Department of Physics and Astronomy
ARC Centre of Excellence for Engineered Quantum Systems, Macquarie University

Winner of the RSNSW Jak Kelly Scholarship Award for 2013

Wednesday 18 December 2013

Union, University and Schools Club, 25 Bent Street, Sydney

The Jak Kelly Award was created in honour of Professor Jak Kelly (1928 - 2012), who was Head of Physics at University of NSW from 1985 to 1989, was made an Honorary Professor of University of Sydney in 2004, and was President of the Royal Society of NSW in 2005 and 2006. Its purpose is to encourage excellence in postgraduate research in physics. It is supported by the Royal Society of NSW and the Australian Institute of Physics, NSW branch. The winner is selected from a short list of candidates who made presentations at the most recent Australian Institute of Physics, NSW branch postgraduate awards.

In 1959, Richard Feynman gave a seminal lecture titled “There’s plenty of room at the bottom”, which set scientists on the journey of controlling light-matter interactions at the nano-scale. Since then, nanotechnology has developed rapidly; nowadays it is inconceivable to think of a new information device whose circuits are not at the nano-scale. Whereas nanoelectronics is a well-consolidated technology producing transistors of less than 50 nm, nanophotonics has yet to overcome some drawbacks. So far, probably the most successful way of pushing light technology to the nano-scale has been plasmonics, in which plane waves are used to excite cleverly designed nano-structures, coupling light with free-electron oscillations on a metallic surface to transmit information. Xavier showed that if symmetry considerations are taken into account and more elaborate beams of light are used, extra information can be retrieved from the same samples. To prove this, he presented a recent experiment carried out in his group in which the complex behaviour of a circular nanometric aperture is easily predicted using symmetry considerations. The experiment deals with an old problem: the circular dichroism (CD) of a sample. CD is a widely used technique in science, with uses ranging from DNA studies to protein spectroscopy. It is defined as the differential absorption of left and right circular polarisation. Typically, it is held that CD can only arise in interactions with chiral structures, i.e. structures whose mirror image cannot be superimposed on the original. Xavier showed that non-chiral structures, such as a circular nano-aperture, can also produce CD when light beams with cylindrical symmetry are used. This reconciles the experimental results and extends the current understanding of this phenomenon using symmetry considerations.

The Dirac Lecture 2013

“Semiconductor nanostructures and quantum phenomena”

Professor Sir Michael Pepper FRS
  Pender Chair in Nanoelectronics
  University College London

Dirac Lecture and Medal Presentation

Professor Pepper was also presented with an Honorary Degree from UNSW after his lecture.

Thursday 21 November 2013

Law Theatre, UNSW

A combination of electron beam lithography and advanced semiconductor growth, developed through industry innovation, has stimulated interest in discovering more about the basic properties of semiconductor nanostructures.

The Lecture showed how advanced semiconductor growth technology, which was developed for the information technology industry, has allowed the creation of new types of structures for investigating the quantum aspects of electron transport. It also showed how the dimensionality which is experienced by the electrons can be reduced from 3 to 2 to 1 and then to 0.

History of the Dirac Lecture:

The Dirac Medal for the Advancement of Theoretical Physics is awarded by UNSW and the Australian Institute of Physics. The Lecture and the Medal commemorate the visit to the university in 1975 of Professor Dirac, who gave five lectures there. The lectures were subsequently published as the book Directions in Physics, and Professor Dirac donated the royalties from it to the University for the establishment of the Dirac Lecture and Prize. The prize includes a silver medal and an honorarium. It was first awarded in 1979.

1216th OGM and public lecture

“Re-thinking science education in Australian schools: development and implementation of the National Science Curriculum”

Dr Mark Butler

  Department of Education and Communities

Wednesday 6 November 2013

Union, University and Schools Club, 25 Bent Street, Sydney

Dr Butler examined the development and nature of the new national senior high school science curriculum. In 2008 the Federal Government secured agreement with all state and territory governments to develop a national F-12 school curriculum. Responsibility for developing the curriculum was assigned to the newly established Australian Curriculum, Assessment and Reporting Authority (ACARA). The national F-10 science curriculum was completed in 2011 and will be implemented in NSW schools from 2014.

In December 2012 the curricula for senior courses in Physics, Chemistry, Biology and Earth and Environmental Science were completed and signed off by the state and territory governments. Provided the newly elected Federal Government continues to support the new curriculum, the national senior science courses will be introduced in NSW schools in 2016.

The senior science curriculum was developed to reflect international best practice in science education. The courses were designed to cater both for students who wish to pursue further study in science and for those who will not continue to study science beyond school level. But in spite of two extensive rounds of public consultation and over two years of refinement, the national senior science curriculum remains controversial, and both the content chosen and the three strands (Science as a Human Endeavour, Science Inquiry Skills, and Knowledge and Understanding) used to present it continue to cause some concern. While the new courses will undoubtedly address the issues of comparability and consistency, only time will tell whether they will attract more students to study science and/or more effectively prepare students for studying science at tertiary level.

Dr Mark Butler is currently Head Teacher of Science at Gosford High School and the National Education Convener of the Australian Institute of Physics. He has taught science in secondary schools in NSW and has been an active member of the professional science education community for over thirty years. Dr Butler is particularly interested in developing strategies to encourage more students to study science in senior high school and at tertiary level.

1215th Ordinary General Meeting

Wednesday, 2 October 2013

"Astrobiology: the latest from Curiosity" - Professor Malcolm Walter

"Seven minutes of terror" was how the operators at the Jet Propulsion Laboratory in the US described the landing of 'Curiosity', the latest rover mission, which landed on Mars in August last year. In the last stage of the landing, the descent vehicle hovered about 80 m above the surface of Mars and lowered Curiosity (which weighs nearly a tonne) by a "sky crane" to a gentle touch-down. Given that it can take up to 20 minutes for signals to reach Mars (up to a 40-minute round trip), there is a significant communication delay that constrains the Earth-based control station.

The purpose of the Curiosity mission is to understand the geological and biological context in order to determine whether life may once have existed, or indeed still exists, on Mars. Mars is somewhat smaller than the Earth, with a surface area about the same as the exposed surface area of the Earth's continents. Until as recently as 60 years ago, it was thought that advanced life may once have existed on Mars and could have been responsible for the canals and other geological phenomena that had been observed through telescopes. It is now thought that the most advanced form of life possible on Mars would be single-cell organisms, probably similar to those that existed on Earth in the early stages of life. To put this in perspective, life first appeared on Earth about 3,500 million years ago and, until about 500 million years ago, consisted entirely of single-cell organisms. Nearly all of the diversity of life on Earth is microscopic, so it makes sense to look for this as the first sign of life in other places in the universe.

One way to understand what early life might look like is to examine geological formations in very old rocks, such as the 3,500 million-year-old rocks of the Pilbara. Fortunately, these rocks are of great interest to geologists because they often hold valuable mineral deposits, so quite a lot is known about them. They are known to have been formed by volcanic action, so a second, complementary approach is to see what forms of life exist in active volcanoes. One such volcano is White Island in New Zealand. Single-cell life forms have been found there in water at up to 123°C, so it is now known that life can exist from about -30°C to over 120°C.

In order to understand the evolutionary context of these single-cell organisms, biologists look at bio-markers in geological samples that are characteristic of life and see how these evolve. This is analogous to studying skeletal evolution in more advanced life forms. Already, a great deal has been learned about the geological environment on Mars. An early mission, Phoenix, found ice at northern latitudes. Surface channels suggest that there was flowing liquid, almost certainly water, at some point in Mars' geological history. Imaging shows that channel formation is still taking place on the surface of Mars, which suggests that there is at least occasional fluid flow. It is too cold for pure water, so if this does turn out to be due to rivers, they would have to be highly saline to remain liquid at these temperatures.

Earlier investigations suggested that there was methane in the Martian atmosphere; however, Curiosity has found none. The earlier observations are now thought to have been due to the carbon-13 isotope of methane in the Earth's atmosphere.

Curiosity is an extremely expensive mission – it takes 265 people every day to keep it running – but its contribution to our understanding of Mars, the origins of the solar system and, by implication, other phenomena in the universe is enormous. A further 15 missions are planned by various public and private agencies over the next decade or so.

1214th Ordinary General Meeting

Wednesday, 4 September 2013

"Open science" - Dr Matthew Todd

The speaker at the Society's 1214th ordinary general meeting was Dr Matthew Todd, a Senior Lecturer in Chemistry at the University of Sydney who is a leading proponent of the concept of "open science".

Dr Todd began with an example of the type of problem to which open science can provide a very practical solution. In Africa and parts of South America and Asia, the parasitic disease schistosomiasis (also known as bilharzia or snail fever) is endemic. Schistosomiasis is caused by infection by water-borne parasites that penetrate the skin and enter the bloodstream. Although the mortality rate is low, schistosomiasis is a serious chronic illness. It is particularly devastating to children – it damages internal organs, impairs growth and causes cognitive impairment. After malaria, it is the most socio-economically devastating disease in the world.

Schistosomiasis can be treated with a drug called praziquantel, which is inexpensive and administered orally. The problem is that praziquantel tablets are very bitter to the taste and, consequently, many people do not complete the course of treatment. But praziquantel is an organic molecule that exists as two stereoisomers (molecules that exist in two forms, one being the mirror-image of the other, in much the same way as the left hand is the mirror-image of the right). Often in pharmacology only one of the stereoisomers has the desired physiological effect and, indeed, this is the case with praziquantel. The "R" stereoisomer kills the parasite and does not have an unpleasant taste. The "S" stereoisomer is inactive and, fortuitously, is entirely responsible for the bitter taste. So why not simply make the R-form? Unfortunately, both forms are produced together in the reactions commonly used to synthesise this drug, and they are not easily separated in the manufacturing process. The best solution is to find catalysts and reaction conditions that favour the production of the desired stereoisomer over the other. However, there is no public funding available for the research, and private enterprise will not fund it because the drug is so cheap that the financial return is too low.

Another problem is that the normal research paradigm is sequential: a research grant is awarded; the work is done; the results are published and, if encouraging, will perhaps lead to a further research grant. This can be dreadfully slow. A far more efficient way of solving complex problems of this nature is collaborative research that proceeds concurrently rather than sequentially – parallel rather than serial processing, as it were. There are a number of examples of this type of collaboration succeeding in areas such as astronomy, mathematics and biology. Dr Todd and his group at the University of Sydney explored using the open science approach to develop a manufacturing route to the active, tasteless R-stereoisomer of praziquantel.

This approach resulted in rapid progress through collaboration of groups around the world, with at least two routes identified as potential practical manufacturing steps.

Dr Todd argues that the whole process of science is based on openness, the sharing of results and collaboration. Issues around patents can be important, but many of the key discoveries of the last century or so have not been subject to patent protection.

The Poggendorf Lecture 2013

Tuesday, 13 August 2013

"Biodiversity and the future of agriculture" - Professor Geoff Gurr

After a hiatus of 20 years, the Poggendorf Lecture was delivered in conjunction with Charles Sturt University, Orange, on Tuesday, 13 August 2013. The lecture was delivered by Professor Geoff Gurr, a biologist and entomologist and Professor of Applied Ecology at Charles Sturt University, where he specialises in the utilisation of natural solutions to control agricultural pests to partially or completely replace synthetic pesticides.

The population of the world is increasing by 170,000 souls per day. Currently, 40% of land is used for some agricultural purpose, and the demand for agricultural products is expected to increase not only as a consequence of population growth but also because of rising living standards in the developing world. For example, the growth in demand for meat is very strong, and it takes 10 kg of grain to produce 1 kg of animal protein. This leads to the conclusion that food production needs to double by 2050. The so-called "green revolution" of the last few decades has enabled the increase in food production to largely match population growth, mainly through the application of nitrogen, phosphorus, some trace elements, water and the wide-scale use of pesticides. But was this revolution truly "green"? These inputs are largely non-renewable and, importantly, do not address the root cause of the problem – pest outbreaks are not due to a lack of pesticide; they are due to other imbalances in the environment. So the world is faced with a "wicked problem": seeking food security with finite renewable resources, declining availability of agricultural land, a changing climate and a moral obligation to preserve biodiversity (human activity, including agriculture, causes biodiversity loss at a rate about 10,000 times greater than the background rate).

Sustainable agricultural practices that are emerging can be considered in three areas: genetic (utilising the natural defence mechanisms identified in certain species and transferring these to other species); species (utilising the natural enemies of pests in order to control population); and ecosystems (developing landscapes that have high biodiversity that tends to equilibrate around sustainable species populations).

The thrust of Professor Gurr's work is that by integrating diverse approaches, including biological, cultural and chemical controls, hazards to humans and the environment can be minimised and, in many cases, productivity of agricultural systems can be improved. The principle underlying this is the acknowledgement that agricultural landscapes benefit from biodiversity and that this has significant benefit in terms of ecosystem services such as pollination of crops, reducing erosion, reducing contamination of water courses with excess nutrients and biological control of crop pests.

Generally, the greater the biological diversity, the fewer the pests. This is because the natural activity of predators, parasites and pathogens maintain potential pests' population densities at a lower level than would occur in their absence. In the case of monocultures, this balance is often upset, enabling the density of pests to get to plague proportions. The widely accepted agricultural response to this is to use synthetic pesticides which often exacerbate the problem by further reducing biological diversity. In turn, the levels of artificial agents required to control pests increases with the consequent damage to the environment.

Professor Gurr described an example in China where rice production was being severely affected by a particular species of plant hopper. This species had evolved resistance to insecticides and was substantially reducing rice yield. Professor Gurr's group used the bund walls that retain water in rice fields to plant vegetation selected because it hosts predators of this species of plant hopper. They also introduced another species of plant hopper that did not affect rice yield and attacked the pest species. In addition, they planted species of flowers that attract parasitic wasps that attack the pest species. The result was a substantial reduction in the pest species, leading to significantly increased rice yield, with secondary benefits such as an increase in the frog population.

There is a common misconception that this type of biological control can have negative impact on yield but a meta-analysis of 286 projects demonstrated an average 80% increase in yield. The "green" approach to pest management potentially could double food production in 10 years: the challenge is to identify the value of ecosystem services and how to utilise them.

Historically, agricultural science has focused on agricultural production and environmental science has focused on protecting the environment – these have coexisted almost as separate disciplines. If food security is to be accomplished in the next few decades, there needs to be an integration of agricultural and environmental protection practices. China has been very active in this. 24% of agricultural land in China has been allocated some form of conservation status. Similarly in Europe, there is a trend towards farmers being encouraged to consider themselves as stewards of the land, rather than owners.

Regrettably, Australia is not leading the way in this area. Nonetheless, there are examples of this type of approach, such as "alley farming", which provides shelter for natural species and encourages biological diversity, thereby significantly reducing the requirement for synthetic pesticides.

Professor Gurr concluded by observing that the world cannot double food production with current agricultural practices – they are simply unsustainable. If we learn to value ecosystem services, and in particular recognise the importance of biodiversity, doubling food production (a requirement for feeding the projected world population) is both achievable and potentially beneficial to the global ecosystem.

1213th Ordinary General Meeting

Wednesday, 7 August 2013

"How numbers came to rule the world: the impact of Luca Pacioli, Leonardo da Vinci and the merchants of Venice on Wall Street" - Jane Gleeson-White

At the 1213th meeting of the Society at the Powerhouse Museum on Wednesday, 7 August 2013, Jane Gleeson-White outlined the argument she presented in her best-selling book Double Entry, a history of the impact of double-entry accounting on the development of the capitalist model that has shaped Western civilisation.

Until the 13th century, the prevailing arithmetic system used in Europe was the Roman system, which largely precluded complex operations such as multiplication and division. During the Renaissance, the Hindu-Arabic number system and algebra were introduced. One major figure in this was Luca Bartolomeo de Pacioli, a Renaissance monk and mathematician, and a colleague of Piero della Francesca and Leonardo da Vinci.

Pacioli wrote a number of major texts on mathematics and was one of the great influences on the development of maths during the Renaissance. He lived for a time in Venice, and the merchants there were quick to adopt his system of double-entry book-keeping to record their mercantile transactions. (The double-entry system requires there to be two accounts for every transaction: one a credit account, the other a debit account. For every creditor there must be a debtor; and for every debtor there must be a creditor.)
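The double-entry rule described above can be sketched in a few lines of code. This is an illustrative toy ledger, not anything from the talk; the account names and transactions are invented for the example.

```python
from collections import defaultdict

class Ledger:
    """A minimal double-entry ledger: every transaction posts the same
    amount twice, a debit to one account and a credit to another."""

    def __init__(self):
        self.debits = defaultdict(float)
        self.credits = defaultdict(float)

    def post(self, debit_account, credit_account, amount):
        """For every debtor there must be a creditor, and vice versa."""
        self.debits[debit_account] += amount
        self.credits[credit_account] += amount

    def trial_balance(self):
        """Total debits always equal total credits if every entry
        went through post(), so the books are internally consistent."""
        return sum(self.debits.values()) == sum(self.credits.values())

# A merchant buys goods for cash, then sells them on credit.
ledger = Ledger()
ledger.post("inventory", "cash", 100.0)     # purchase: debit inventory, credit cash
ledger.post("receivables", "sales", 150.0)  # sale: debit receivables, credit sales
print(ledger.trial_balance())  # the two column totals always agree
```

The point the Venetian merchants exploited is visible in `trial_balance()`: because each transaction is recorded symmetrically, the ledger checks itself, independently of what goods were actually traded.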

Although merchants had recorded their transactions since Phoenician times, these records were largely narrative in nature. The merchants of Venice were able to abstract and summarise financial performance in a single accounting system that was independent of the goods being transacted. Over the next couple of centuries the double-entry bookkeeping system was adopted, first throughout Europe and then across the rest of the world.

Gleeson-White argues that this innovation was fundamental to the development of capitalism and the consumer-oriented economic system that prevails worldwide today. It led to the system of national accounts used by governments, which distils all human activity into a single number: gross domestic product, or GDP. She further argues that double-entry book-keeping was a major influence on the scientific revolution, and that together these led to the industrialisation of the world and the unsustainable stress it currently faces. These claims are not uncontentious, and there was a lively discussion after the talk.

Jane's talk was broadcast by the ABC on Radio National's Big Ideas on Tuesday 3 September 2013. Click 1213th OGM to download the RN broadcast.

1212th Ordinary General Meeting

Wednesday 3 July 2013

"Caring for highly processed wood pulp? The role of the State Library in the 21st century" - Dr Alex Byrne

At the 1212th ordinary general meeting of the Society on Wednesday, 3 July 2013, we were delighted to welcome Dr Alex Byrne, State Librarian and Chief Executive of the State Library of NSW. Dr Byrne gave a wide-ranging talk about the State Library and the extraordinarily valuable collection that it holds.

The State of NSW is fortunate to have perhaps the most important collection in Australia. No other state library is its equal, and the only Australian library that might come close is the National Library in Canberra. The State Library is a library of deposit, meaning that there is a legal requirement for a copy of every printed publication produced in the State of NSW to be lodged with the library (there are two other libraries of deposit in NSW – the Parliamentary Library and the Fisher Library at the University of Sydney). The collection that the Library houses extends to 138 linear kilometres of shelf-space and is being added to at a rate of 2 linear km per year. The collection represents one of the major assets of the State of NSW and is valued at $2.1 billion.

Examples of important items that the Library holds are the stern-plate of HMS Resolution (James Cook's ship on his second and ill-fated third voyages) and Cook's ammunition belt. There is an extensive World War I collection, of particular importance being the personal diaries kept by soldiers. Many soldiers kept these small, notebook-sized diaries, and they give deep insight into the personal experiences of the writers. There is even one diary written by an Australian general, despite diaries being strictly against regulations.

The collection is diverse and is not restricted to printed materials. There are many important paintings, the entire collection of newspaper photographs from the Packer Press (over 350,000 images) and a wide variety of other artefacts that give enormous insight into the cultural narrative that has unfolded over the last 200 years or so (the Library started as the Australian Subscription Library in 1826).

Unfortunately, much of the collection is on media that do not last well. For example, wood-pulp paper and many of the digital media of the last 30 or 40 years start deteriorating within 20-30 years. Currently, the most practical solution to this problem is to digitise the collection, and the Library has been fortunate to receive a government grant of $32.6 million over the next four years to renew its digitisation infrastructure, with a further $32 million over the subsequent six years to commence digitisation of the collection. Even with this substantial sum of over $60 million spent over 10 years, only about 6% of the collection will be converted into searchable, digital form.

The Library also houses a substantial collection on behalf of the Royal Society of NSW, and we intend to work with the State Library to make this important collection more accessible.

Royal Society events

The Royal Society of NSW organises a number of events in Sydney throughout the year. These include Ordinary General Meetings (OGMs), held on the first Wednesday of the month (there is no meeting in January), at which Society business is conducted, new Fellows and Members are inducted, and reports from Council are given to the membership. This is followed by a talk and an optional dinner. Drinks are served before the meeting. There is a small charge to attend the meeting and talk, and to cover refreshments. The dinner is a separate charge and must be booked in advance. All OGMs are open to members of the public.

The first OGM in February has speakers drawn from the Royal Society Scholarship winners, and the December OGM hears from the winner of the Jak Kelly award, before an informal Christmas party.  The April or May event is our black-tie Annual Dinner and Distinguished Fellow lecture.

Other events are held in collaboration with other groups, including:

  • The Four Societies lecture (with the Australian Institute of Energy, the Nuclear Panel of Engineers Australia [Sydney Division] and the Australian Nuclear Association)
  • The Forum (with the Australian Academy of Technology and Engineering, the Australian Academy of Science, the Australian Academy of the Humanities and the Academy of the Social Sciences in Australia)
  • The Dirac lecture (with UNSW Australia and the Australian Institute of Physics)
  • The Liversidge Medal lecture (with the Royal Australian Chemical Institute)