Royal Society of NSW News & Events

1207th Ordinary General Meeting

Wednesday 6 February 2013

Presentations by Royal Society of NSW scholarship winners 2013

The 2012 Scholarship winners presented their work at the first meeting of 2013, held at the Union University and Schools Club on Wednesday 6 February.

Helen Smith (left) is completing her PhD at Sydney University as part of a Sydney-based conservation programme to reintroduce the native bush rat into the Sydney Harbour National Park. If successful, this promises to be an effective way of displacing introduced rats that have had significant impact on local wildlife. Initial indications suggest that, once established, native rats successfully compete with introduced rats.

Anwen Krause-Heuer (right) is in the midst of a PhD at the University of Western Sydney and is working on the development of new cancer drugs based on cisplatin. The aim of the work is to develop platinum-based anti-cancer complexes that have lower toxicity than established treatments.

Jendi Kepple is undertaking a PhD at the University of NSW and is investigating the design of various alloys and composite materials to improve the design of launch vehicles used in the European space programme. (Unfortunately, Jendi was not able to attend the evening as she was at a conference overseas. She was well represented by one of her colleagues.)


1208th Ordinary General Meeting

Wednesday, 6 March 2013

"The evolution of galaxies" - Dr Ray Norris

Ray Norris, a senior astrophysicist with the CSIRO, spoke at the 1208th OGM of the Society on one of the Australian Square Kilometre Array Pathfinder (ASKAP) projects, Project EMU (an acronym for Evolutionary Map of the Universe).

The ASKAP project is the first phase of the $2 billion Square Kilometre Array project shared between South Africa and Australia. This phase, which costs $170 million, is being built in Western Australia.

It consists of 36 12-m radio antennas that achieve extraordinarily high resolution using devices called phased-array feeds. Project EMU is one of two high-priority projects currently underway. EMU will conduct a deep survey of a patch of dark sky, making images at several different wavelengths to create a census of all galaxies within the patch being examined.

The aim is to identify the different evolutionary tracks of galaxies and, hopefully, to identify some important but rare transitional stages. The survey is expected to be able to look back in time to the formation of the first stars, around 400 million years after the big bang that took place 13.7 billion years ago. Radio telescopes are ideal for this type of survey because they are unaffected by dust, and when radio data are combined with infrared and optical data they give a very powerful picture of the field of view.

Dr Norris outlined many of the phenomena that EMU is investigating. The science goals of the EMU project are to better understand the evolution of massive black holes, to explore the large-scale structure and cosmological parameters of the universe (for example, test theories about dark energy) and to explore diffuse low-surface-brightness radio objects. The project will also add substantially to a large database of surveys that can be mined as computing capacity continues to increase.


1209th Ordinary General Meeting

Wednesday, 3 April 2013

Inaugural Fellows Lecture - Professor Michael Archer AM

"An evolutionary history of Australia"

The Society was proud to have Professor Michael Archer AM present the inaugural Fellows Lecture on Wednesday, 3 April 2013. Professor Archer was one of the first Fellows appointed by the Society, recognising his outstanding work as a palaeontologist, particularly in relation to the Riversleigh fossil find in Queensland, one of the richest fossil deposits in the world.

Until about 50 years ago, only about 70 fossil mammals had been found in the whole Australian continent, compared to about 50,000 in North America. The geology of the Riversleigh area, in northern Queensland, is unusual. There are large expanses of very old (1.6 billion years) Precambrian rock and more recent Cambrian deposits (500 million years old) that contain rather unremarkable fossils of the era. But there are pockets of more recent geological deposits, 10-25 million years old, that have been found to contain extraordinarily well-preserved fossils; these deposits cover about 40 sq. km. A wide range of unusual animals has been found: five kinds of thylacine; giant, toothed platypuses; flesh-eating kangaroos; and ancient birds, some of them the biggest ever discovered, weighing up to 400 kg. Huge fossilised snakes, a diverse and important range of ancient bats, and a great variety of trees and plants have also been discovered.

How did this extraordinary preservation take place? Professor Archer explained that two phenomena together resulted in this remarkable deposit. Water that percolated up from subterranean deposits was saturated with calcium carbonate, which quickly precipitated around any dead animals that fell into the water. This preserved skeletons intact, and the carbonate is easily removed using a weak acid, such as acetic acid, which quickly dissolves the calcium carbonate to expose a well-preserved fossilised skeleton. In addition, another phenomenon, called 'bacterially-mediated phosphatisation', in which phosphates from bat droppings preserve soft tissue, has resulted in remarkably complete fossils being found in many areas. In a process known as 'tufagenic barrage', calcium carbonate deposits formed dams that allowed fossilisation to take place; these dams were ultimately breached but the fossils were preserved. At the time, the Riversleigh area was covered with rainforest, but this has gradually receded to coastal zones.

The Riversleigh deposits cover five phases from 25 million years ago to 1.5 million years ago and form the richest such sequence in Australia. (There is only one other similar deposit in the world, in France.) The Riversleigh find has completely changed perceptions about Australia's past. It is now clear that the diversity in the fossil record suggests an environment that was as rich at the time as Borneo and the Amazon regions are today. About 15 million years ago Australia started to dry out, yet it was not until about 3 million years ago that extensive grasslands formed.

Professor Archer pointed out that the fossil record gives us a very rich understanding of the way in which current species have evolved, from which we can deduce how habitat change can be managed and how species at risk of extinction as climate change takes place can be protected. We can also gain insight into which species are under threat by understanding the extent to which their populations have increased or declined over long periods of time.


1210th Ordinary General Meeting

Wednesday, 1 May 2013

"In an analogue world, envisioning the digital future: Paul Otlet, a
forgotten forefather of today's 'information society' " - Emeritus
Professor Boyd Rayward

Emeritus Professor Boyd Rayward gave a fascinating talk about someone whom he styled as 'forgotten' but who, in reality, most members of the audience had never heard of: the Belgian Paul Otlet (1868 - 1944). A lawyer by profession and an activist for peace in the very troubled times of Europe in the first half of the 20th century, Otlet had the revolutionary idea of collating and indexing all knowledge in a way that could be augmented, updated and proliferated worldwide. As each new technology came along (telegraph, telephone, radio, etc.) he embraced it into his universal knowledge network.

In 1910, Otlet and Henri La Fontaine first envisioned a "city of knowledge", which Otlet originally named the "Palais Mondial" ("World Palace", later called the Mundaneum), that would serve as a central repository for information and "radiate knowledge to the rest of the world". The many world cities that were designed were mostly never built, and Otlet's own offices were closed down by the Belgian Government in 1934.

His best-known legacy, at least until the modern computer era, was the invention of the 3 x 5 inch standard index card, once found in every library. There were 15 million of them in the Mundaneum before it closed in 1934. By comparison, Wikipedia contained just over 30.2 million pages as of 21 May 2013.

A museum was opened in 1988 in Mons, Belgium as a kind of recreation of the Mundaneum and repository of the papers of Otlet (and La Fontaine). Professor Rayward divides his time between Belgium, Illinois (where he is professor emeritus) and Sydney, and continues to research the life of this amazing and far-sighted man of the world.


Royal Society of NSW Forum 2013

Thursday, 6 June 2013

The Royal Society of NSW Forum 2013 was held at the Powerhouse Museum on Thursday 6 June before a large audience. Antony Funnell of the ABC's Radio National moderated the discussion between:

  • Professor Brian Schmidt AC FRSN, Nobel Prize winner
  • Professor Steven Schwartz AM, former Macquarie University Vice Chancellor
  • Ms Judith Wheeldon AM, former Principal of both Queenwood School for Girls and Abbotsleigh
  • Professor Merlin Crossley, Dean of Science at the University of NSW

Among other questions, our panellists discussed: will a falling focus on science and technology in education really be a problem for innovation in Australia? Is it a matter of basic education? Is it poor teaching? Is there a fundamental aversion to maths and science in Australia? Given our reliance on technology, why is there not a greater desire to utilise it and to develop it? Is there a "science literacy" problem in Australia? Why have we become passive about science and technology, rather than embracing it at its fundamental levels?

In case you missed it, it was broadcast on ABC Radio National Big Ideas on Monday 17 June (click Forum 2013 to download a recording of the broadcast).


1212th Ordinary General Meeting

Wednesday 3 July 2013

"Caring for highly processed wood pulp? The role of the State library in the 21st century" - Dr Alex Byrne

At the 1212th ordinary general meeting of the Society on Wednesday, 3 July 2013, we were delighted to welcome Dr Alex Byrne, State Librarian and Chief Executive of the State Library of NSW. Dr Byrne gave a wide-ranging talk about the State Library and the extraordinarily valuable collection that it holds.

The State of NSW is fortunate to have perhaps the most important collection in Australia: no other state library is its equal, and the only Australian library that might come close is the National Library in Canberra. The State Library is a library of deposit, meaning that there is a legal requirement for a copy of every printed publication produced in the State of NSW to be lodged with the Library. (There are two other libraries of deposit in NSW: the Parliamentary Library and the Fisher Library at the University of Sydney.) The collection that the Library houses extends to 138 linear kilometres of shelf-space, and this is being added to at a rate of 2 linear km per year. The collection represents one of the major assets of the State of NSW and is valued at $2.1 billion.

Examples of important items that the Library holds are the stern-plate of HMS Resolution (James Cook's ship on his second and ill-fated third voyages) and Cook's ammunition belt. There is an extensive World War I collection, and of particular importance are personal diaries kept by soldiers. Many soldiers kept these small, notebook-size diaries, and they give deep insight into the personal experiences of the writers. There is even one diary written by an Australian general, despite diaries being strictly against regulations.

The collection is diverse and is not restricted to printed materials. There are many important paintings, the entire collection of newspaper photographs from the Packer Press (over 350,000 images) and a wide variety of other artefacts that give enormous insight into the cultural narrative that has unfolded over the last 200 years or so (the Library started as the Australian Subscription Library in 1826).

Unfortunately, much of the collection is on media that does not last well. For example, wood-pulp paper and many of the digital media of the last 30 or 40 years start deteriorating within 20-30 years. Currently, the most practical solution to this problem is to digitise the collection, and the Library has been fortunate to receive a government grant of $32.6 million over the next four years to renew the digitisation infrastructure, with a further $32 million over the subsequent six years to commence digitisation of the collection. Even with this substantial sum of over $60 million to be spent over 10 years, only about 6% of the collection will be converted into searchable, digital form.

The Library also houses a substantial collection on behalf of the Royal Society of NSW, and we intend to work with the State Library to make this important collection more accessible.


1213th Ordinary General Meeting

Wednesday, 7 August 2013

"How numbers came to rule the world: the impact of Luca Pacioli,
Leonardo da Vinci and the merchants of Venice on Wall Street" - Jane

At the 1213th meeting of the Society at the Powerhouse Museum on Wednesday, 7 August 2013, Jane Gleeson-White outlined the argument she presented in her best-selling book Double Entry, a history of the impact of double-entry accounting on the development of the capitalist model that has shaped Western civilisation.

Until the 13th century, the prevailing arithmetic system used in Europe was the Roman system, which largely precluded complex operations such as multiplication and division. During the Renaissance, the Hindu-Arabic number system and algebra were introduced. One major figure in this was Luca Bartolomeo de Pacioli, a Renaissance monk and mathematician, and a colleague of Piero della Francesca and Leonardo da Vinci.

Pacioli wrote a number of major texts on mathematics and was one of the great influences on the development of maths during the Renaissance. He lived for a time in Venice and the merchants there were quick to introduce his system of double-entry book-keeping to record their mercantile transactions. (The double-entry system requires there to be two accounts for every transaction: one a credit account, the other a debit account. For every creditor there must be a debtor; and for every debtor there must be a creditor.)
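The rule in the parenthesis above can be sketched in a few lines of code. This is a minimal illustrative sketch (the `Ledger` class and the account names are invented for this example, not drawn from any real accounting system): every posting debits one account and credits another by the same amount, so the balances of a consistent ledger always sum to zero.

```python
from collections import defaultdict

class Ledger:
    """Minimal double-entry ledger: every transaction posts a debit to one
    account and an equal credit to another, so the books always balance."""

    def __init__(self):
        self.accounts = defaultdict(float)  # account name -> signed balance

    def post(self, debit_account, credit_account, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        self.accounts[debit_account] += amount   # the debtor side
        self.accounts[credit_account] -= amount  # the matching creditor side

    def trial_balance(self):
        # In a consistent double-entry ledger the balances sum to zero.
        return sum(self.accounts.values())

ledger = Ledger()
ledger.post("Inventory", "Cash", 100.0)  # buy goods for cash
ledger.post("Cash", "Sales", 150.0)      # sell goods for cash
assert ledger.trial_balance() == 0.0
```

The invariant checked in the last line is the essence of the Venetian system: because every debit has a matching credit, an unbalanced ledger immediately signals an error somewhere in the records.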

Although merchants had recorded their transactions since Phoenician times, these records were largely narrative in nature. The merchants of Venice were able to abstract and summarise financial performance into a single accounting system that was independent of the goods being transacted. Over the next couple of centuries the double-entry bookkeeping system was adopted, first throughout Europe and then in the rest of the world.

Gleeson-White argues that this innovation was fundamental to the development of capitalism and the consumer-oriented economic system that prevails worldwide today. It led to the system of national accounts that is used by governments that distils all human activity into a single number: gross domestic product or GDP. She further argues that double-entry book-keeping was a major influence on the scientific revolution and that together these led to the industrialisation of the world and the unsustainable stress that it is currently facing. These claims are not uncontentious and there was a lively discussion after the talk.

Jane's talk was broadcast by the ABC on Radio National's Big Ideas on Tuesday 3 September 2013. Click 1213th OGM to download the RN broadcast.


The Poggendorf Lecture 2013

Tuesday, 13 August 2013

"Biodiversity and the future of agriculture" - Professor Geoff Gurr

After a hiatus of 20 years, the Poggendorf Lecture was delivered in conjunction with Charles Sturt University, Orange, on Tuesday, 13 August 2013. The lecture was delivered by Professor Geoff Gurr, a biologist and entomologist and Professor of Applied Ecology at Charles Sturt University, where he specialises in the utilisation of natural solutions to control agricultural pests to partially or completely replace synthetic pesticides.

The population of the world is increasing by 170,000 souls per day. Currently, 40% of land is used for some agricultural purpose, and the demand for agricultural products is expected to increase not only as a consequence of population growth but also through the rising living standards of people in the developing world. For example, the growth in meat demand is very strong, and it takes 10 kg of grain to produce 1 kg of animal protein. This leads to the conclusion that food production needs to double by 2050.

The so-called "green revolution" of the last few decades has enabled the increase in food production to largely match population growth, largely through the application of nitrogen, phosphorus, some trace elements, water and the wide-scale use of pesticides. But was this revolution truly "green"? Its inputs are largely non-renewable and, importantly, do not actually address the root cause of the problem: pest outbreaks are not due to a lack of pesticide, they are due to other imbalances in the environment. So the world is faced with a "wicked problem": seeking food security with finite renewable resources, a declining availability of agricultural land, a changing climate and a moral obligation to preserve biodiversity (human activity, including agriculture, causes biodiversity loss at a rate about 10,000 times greater than the background rate).

Sustainable agricultural practices that are emerging can be considered in three areas: genetic (utilising the natural defence mechanisms identified in certain species and transferring these to other species); species (utilising the natural enemies of pests in order to control population); and ecosystems (developing landscapes that have high biodiversity that tends to equilibrate around sustainable species populations).

The thrust of Professor Gurr's work is that by integrating diverse approaches, including biological, cultural and chemical controls, hazards to humans and the environment can be minimised and, in many cases, productivity of agricultural systems can be improved. The principle underlying this is the acknowledgement that agricultural landscapes benefit from biodiversity and that this has significant benefit in terms of ecosystem services such as pollination of crops, reducing erosion, reducing contamination of water courses with excess nutrients and biological control of crop pests.

Generally, the greater the biological diversity, the fewer the pests. This is because the natural activity of predators, parasites and pathogens maintains potential pests' population densities at a lower level than would occur in their absence. In monocultures, this balance is often upset, enabling the density of pests to reach plague proportions. The widely accepted agricultural response is to use synthetic pesticides, which often exacerbate the problem by further reducing biological diversity. In turn, the levels of artificial agents required to control pests increase, with consequent damage to the environment.

Professor Gurr described an example in China where rice production was being severely affected by a particular species of plant hopper. This species had evolved resistance to insecticides and was substantially reducing rice yield. Professor Gurr's group used the bund walls that retain water in rice fields to plant vegetation selected because it hosts predators of this species of plant hopper. They also introduced another species of plant hopper that did not affect rice yield and attacked the pest species. In addition, they planted species of flowers that attracted parasitic wasps that attack the pest species. The result was a substantial reduction in the pest species, leading to significantly increased rice yield, with secondary benefits such as an increase in the frog population.

There is a common misconception that this type of biological control has a negative impact on yield, but a meta-analysis of 286 projects demonstrated an average 80% increase in yield. The "green" approach to pest management could potentially double food production in 10 years: the challenge is to identify the value of ecosystem services and how to utilise them.

Historically, agricultural science has focused on agricultural production and environmental science on protecting the environment; the two have coexisted almost as separate disciplines. If food security is to be achieved in the next few decades, agricultural and environmental protection practices need to be integrated. China has been very active in this: 24% of agricultural land in China has been allocated some form of conservation status. Similarly, in Europe there is a trend towards farmers being encouraged to consider themselves stewards of the land, rather than owners.

Regrettably, Australia is not leading the way in this area. Nonetheless, there are examples of this type of approach, such as "alley farming", which provides shelter for natural species and encourages biological diversity, thereby significantly reducing the requirement for synthetic pesticides.

Professor Gurr concluded by observing that the world cannot double food production with current agricultural practices: they are simply unsustainable. If we learn to value ecosystem services, in particular by recognising the importance of biodiversity, doubling food production (a requirement if the projected world population is to be fed) is both achievable and potentially beneficial to the global ecosystem.


1214th Ordinary General Meeting

Wednesday, 4 September 2013

"Open science" - Dr Matthew Todd

The speaker at the Society's 1214th ordinary general meeting was Dr Matthew Todd, a Senior Lecturer in Chemistry at the University of Sydney who is a leading proponent of the concept of "open science".

Dr Todd began with an example of the type of problem to which open science can provide a very practical solution. In Africa and parts of South America and Asia, the parasitic disease schistosomiasis (also known as bilharzia or snail fever) is endemic. Schistosomiasis is caused by infection by water-borne parasites that penetrate the skin and enter the bloodstream. Although the mortality rate is low, schistosomiasis is a serious chronic illness. It is particularly devastating to children – it damages internal organs, impairs growth and causes cognitive impairment. After malaria, it is the most socio-economically devastating disease in the world.

Schistosomiasis can be treated by a drug called praziquantel that is inexpensive and is administered orally. The problem is that praziquantel tablets are very bitter to the taste and, consequently, many people do not complete the course of treatment. But praziquantel is an organic molecule that exists as two stereoisomers (molecules that exist in two forms, one being the mirror-image of the other, in much the same way as the left hand is the mirror-image of the right). Often in pharmacology only one of the stereoisomers has the desired physiological effect and, indeed, this is the case with praziquantel. The "R" stereoisomer kills the parasite and does not have an unpleasant taste. The "S" stereoisomer is inactive and, fortuitously, is entirely responsible for the bitter taste. So why not simply make the R-form? Unfortunately, both forms are produced together in the reactions commonly used to synthesise this drug and are not easily separated in the manufacturing process. The best solution is to find catalysts and reaction conditions that favour the production of the desired stereoisomer over the other. However, there is no public funding available for the research, and private enterprise will not fund it because the drug is so cheap that the financial return is too low.

Another problem is that the normal research paradigm is sequential: a research grant is awarded; the work is done; the results are published and, if encouraging, will perhaps result in a further research grant. This can be dreadfully slow, and a far more efficient way of solving complex problems of this nature is collaborative research that proceeds concurrently rather than sequentially - parallel rather than serial processing, as it were. There are a number of examples of this type of collaboration being successful in areas such as astronomy, mathematics and biology. Dr Todd and his group at Sydney University explored using the open science approach to develop a manufacturing route to the active, tasteless R-stereoisomer of praziquantel.

This approach resulted in rapid progress through collaboration of groups around the world, with at least two routes identified as potential practical manufacturing steps.

Dr Todd argues that the whole process of science is based on openness, the sharing of results and collaboration. Issues around patents can be important, but many of the key discoveries of the last century or so have not been subject to patent protection.


1215th Ordinary General Meeting

Wednesday, 2 October 2013

"Astrobiology: the latest from Curiosity" - Professor Malcolm Walter

"Seven seconds of terror" was how the operators at the Jet Propulsion Laboratory in the US describe the landing of 'Curiosity', the latest rover mission that landed on Mars in August last year. In the last stage of the landing, the entry vehicle hovered about 80 m above the surface of Mars and lowered Curiosity (which weighs nearly a tonne) by cranes to a gentle touch-down. Given that it can take up to 20 minutes for signals to reach Mars (or up to a 40 minute round-trip) there is a significant delay that constrains the Earth-based control station.

The purpose of the Curiosity mission is to understand the geological and biological context and to determine whether life may have existed or, indeed, still exists on Mars. Mars is somewhat smaller than the Earth, with a surface area about the same as the exposed surface area of the Earth's continents. Until as recently as 60 years ago, it was thought that advanced life may once have existed on Mars and could have been responsible for the canals and other geological phenomena that had been observed through telescopes. It is now thought that the most advanced form of life possible on Mars would be single-cell organisms, probably similar to those that existed on Earth in the early stages of life. To put this in perspective, life first appeared on Earth about 3,500 million years ago and, until about 500 million years ago, consisted entirely of single-cell organisms. Nearly all of the diversity of life on Earth is microscopic, so it makes sense to look for this as the first sign of life in other places in the universe.

One way to understand what early life might look like is to examine geological formations in very old rocks, such as the 3,500 million-year-old rocks in the Pilbara. Fortunately, these rocks are of great interest to geologists because they often hold valuable mineral deposits, so quite a lot is known about them. They are known to have been formed by volcanic action, so a second, complementary approach is to see what forms of life exist in active volcanoes. One such volcano is White Island in New Zealand. Single-cell life forms have been found there in water up to 123°C, so it is now known that life can exist from about -30°C to over 120°C.

In order to understand the evolutionary context of these single-cell organisms, biologists look at bio-markers in geological samples that are characteristic of life and see how these evolve; this is analogous to looking at skeletal evolution in more advanced life forms. A great deal has already been learned about the geological environment on Mars. An early mission, Phoenix, found ice at northern latitudes. Channels on the surface suggest that liquid, almost certainly water, flowed at some point in Mars' geological history. Imaging shows that channel formation is still taking place on the surface of Mars, which suggests that, at times at least, there is fluid flow. It is too cold for pure water, so if this turns out to be due to rivers, they would have to be highly saline to be liquid at these temperatures.

Earlier investigations suggested that there was methane in the Martian atmosphere; however, Curiosity has found none. The earlier observations are now thought to be due to a C-13 isotope of methane in the Earth's atmosphere.

Curiosity is an extremely expensive mission (it takes 265 people every day to keep it running), but its contribution to our understanding of Mars, the origins of the solar system and, by implication, other phenomena in the universe is enormous. A further 15 missions are planned by various public and private agencies over the next decade or so.


1197th General Meeting

"Grid-connected energy storage: the key to sustainable energy?"

Professor Tony Vassallo, Delta Electricity Chair in Sustainable Energy Development, School of Chemical & Biomolecular Engineering, University of Sydney, NSW 2006

Wednesday 2 November 2011 at 6.30 pm

Lecture Theatre 106, New Law Building, University of Sydney

Many countries in the world are committing large amounts of research resources to the development of sustainable energy generation technologies. One major disadvantage in using electricity as an energy source is that it is difficult to store. Renewable energy sources have the added problem that they are only available at certain times. For example, solar energy is only generated when there is strong sunlight.

At the meeting of the Society in Sydney on 2 November, Professor Tony Vassallo, the Delta Electricity Chair in Sustainable Energy Development at the University of Sydney, gave a comprehensive coverage of the issues, challenges and potential advantages of having energy storage that can be directly connected to the electricity distribution grid.

There are important technical and economic reasons for wanting to store energy so that it is quickly accessible to the consumer via the grid. Electricity demand varies quite substantially over the day, and the pattern also depends on the time of year: in summer, air-conditioning loads in the afternoon are high, while in winter loads peak in the early evening and early morning. Most of Australia's electricity is generated in large, coal-fired power stations that can take hours to react to changes in demand, so enabling them to respond without the risk of blackouts means a lot of energy is wasted. Currently, the only means of providing reasonably responsive energy to the grid is the Snowy Mountains hydroelectric system.

Many technologies are currently being developed to provide energy storage capacity. These include thermal storage using molten salt (for example, in Spain), hydroelectric storage, compressed air, superconducting magnets, ultra-capacitors, high-energy/high-efficiency flywheels and a range of battery technologies.

One promising technology avenue is integrating battery technology with renewables. For example, used in conjunction with wind energy generation, battery storage can reduce short-term fluctuations and allows dispatch when the load is high. It also allows a higher proportion of the total wind generation capacity to be included in the calculation of base-load capacity and lowers the capital cost of transmission equipment because the variability in load is reduced. The question Professor Vassallo addressed was: is battery technology feasible?

Large battery banks have already been installed in pilot installations in other parts of the world, for example a 34 MW battery bank at a 50 MW wind farm in Japan. But it may not be necessary to install such large battery banks, which have a high capital cost: batteries can instead be distributed throughout the grid at zone substations and local substations. Another innovative concept is to use the batteries in electric cars to provide storage: during times when demand is high and cars are not being used (for example, early afternoon on a hot day), car batteries connected to the grid could provide localised storage capacity. Commercial models of these concepts are currently under development.

Professor Vassallo's own research programme relates to developing advanced battery and supercapacitor technologies, such as graphene/nanotube capacitors, the use of regenerative fuel cells, and the role of distributed storage in electricity networks.


1196th General Meeting

"Sex in the sea: how understanding the weird and bizarre sex lives of fishes is the first step to their conservation"

Professor Bill Gladstone, Head of the School of the Environment, University of Technology Sydney

Wednesday 5 October 2011 at 6.30 pm

Lecture Theatre 106, New Law Building, University of Sydney

In 1938 the pioneer deep sea explorer William Beebe described the sex life of anglerfishes as "sheer fiction, beyond all belief unless we have seen the proof of it". Beebe would be equally amazed today by the even more diverse reproductive strategies of fishes that have been discovered, and how this understanding is applied for conservation. This seminar will cover some of the more weird and bizarre examples of the sex lives of fishes from the deep sea to the Red Sea, the evolutionary pressures, and how this science of sex in the sea is being used for conservation purposes.


1195th General Meeting

"Distributed small-scale production of chemicals - why and how"

Professor Brian Haynes

Wednesday 7 September 2011 at 6.30 pm

Seminar Room 102, New Law Building, University of Sydney

In the last two decades, tens of thousands of jobs have been lost from the Australian chemical manufacturing sector. As Professor Brian Haynes of the School of Chemical and Biomolecular Engineering at the University of Sydney explained, there are a number of reasons for this. In Australia, feedstocks are often in remote locations, the nation is geographically remote from large global markets, Australian industry has traditionally had low R&D expenditure, and the domestic market often does not justify investment in world-scale manufacturing capacity. Nonetheless, sales of chemicals and pharmaceuticals in Australia amount to tens of billions of dollars per annum and contribute significantly to Australia's balance-of-trade deficit. Professor Haynes' group at the University of Sydney has been working on technologies that might change this situation dramatically.

One of the important reasons that chemical plants are so big is that the relative capital cost per unit of production drops substantially as plant capacity increases. Historically, large plant capacity has been achieved by designing and building very large production equipment. This solves the capacity/cost problem but introduces other major costs and inefficiencies. In particular, it is much more difficult to control chemical reactions in large reactors (so impurities and by-products are produced and have to be dealt with) and, often, energy efficiency is compromised. An alternative being explored by Professor Haynes' group is to use advanced reactor design technologies to make relatively small, highly efficient manufacturing processes that are scaled simply by adding more units rather than by building very large production equipment. This approach enables production capacity to be located near feedstocks or customers; capital costs are much lower, and the process has much lower environmental impact, is safer to operate and is more energy efficient.

This "process intensification" approach to chemical reactor design uses technology analogous to that used in printed circuits. By etching or engraving small channels in plates of stainless steel (or other alloys), then stacking and fusing the plates, pipework, heat exchangers and reaction vessels can be formed. Because of their very small size, reaction kinetics, heat transfer and mass transfer can be controlled very precisely.

One way of achieving this is to design a series of small reactors known as "multiple adiabatic beds", laid out with heat exchangers between each bed. This enables the heat generated during a reaction to be recovered and gives very high energy efficiency. One important industrial process where this approach is being used is methane-steam reforming, in which methane and steam react first to form carbon monoxide and hydrogen, with the carbon monoxide then reacting further with steam to form carbon dioxide and hydrogen. For every mole of methane used, four moles of hydrogen are produced. Large plants currently use steam to reform natural gas (which contains a high proportion of methane), producing large quantities of hydrogen for industrial use, but a great deal of heat is wasted; using the process intensification approach, much greater energy efficiency is achieved. The group at Sydney University has demonstrated this process on a pilot unit which is both scalable and, unlike large industrial plants, can be started up in a couple of hours.
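The four-moles figure follows directly from the stoichiometry of the two reactions described above (standard reforming chemistry, shown here for clarity rather than as a detail of the Sydney pilot unit):

```latex
\begin{align*}
\mathrm{CH_4} + \mathrm{H_2O} &\rightarrow \mathrm{CO} + 3\,\mathrm{H_2} && \text{(steam reforming)} \\
\mathrm{CO} + \mathrm{H_2O} &\rightarrow \mathrm{CO_2} + \mathrm{H_2} && \text{(water-gas shift)} \\
\mathrm{CH_4} + 2\,\mathrm{H_2O} &\rightarrow \mathrm{CO_2} + 4\,\mathrm{H_2} && \text{(overall)}
\end{align*}
```

Adding the two steps gives one mole of carbon dioxide and four moles of hydrogen for every mole of methane consumed.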

There are a number of other important processes, currently run at very large scale to make industrial chemicals, where this technology could be employed. It can be ideal for relatively small industrial economies like Australia and for other markets remote from large-scale plants.


Joint meeting with the Australian Academy of Forensic Sciences

Thursday, 19 February 2014

"Searching for clues: Unmasking art fraud and fraudsters" - Associate Professor Robyn Sloggett

At the first joint meeting of the Society and the Australian Academy of Forensic Sciences, Associate Professor Robyn Sloggett explained the approach taken by forensic scientists in investigating and prosecuting cultural heritage offences. The difficulty facing authorities is determining whether or not cultural records are true and verifiable. Forensic examination in these situations follows the Locard principle (named after Edmond Locard, the pioneering French forensic scientist) that "every contact leaves a trace".

In order to determine the provenance of a work of art, the forensic scientist seeks to establish how the object was made, what it is made of, when it was made and where it was made; no attempt is made to determine who made it. An important foundation of provenance is establishing a body of work that can be used as a reference. Here, art and science converge. For example, in determining whether or not a painting might have been painted by Rembrandt, it is known that Rembrandt lived from 1606 to 1669, that certain types of pigments but not others were available in that era, that the canvas and other materials need to be consistent, and so on. Each of these tests is necessary but none, on its own, is sufficient to establish authenticity. As the philosopher Karl Popper put it, testing cannot establish authenticity; rather, it can only falsify it.

When forensic science is used to determine whether fraud has taken place, a number of questions have to be answered: was financial benefit in question? Was there deliberate deception? Was a non-authentic work passed off as authentic? Professor Sloggett referred to a notable case in which the Australian painter and art copyist, William Blundell, painted works in the style of a number of well-known Australian artists. He referred to these works as "innuendos", stating that they were intended to be used for decorative purposes only and were not passed off as originals, typically being sold for only a few hundred dollars; thus, there was no intention to defraud.

When authentication is required, the forensic approach is a combination of scientific analysis, gathering historical facts, attempting to verify the provenance, and weighing up the evidence for and against the item being authentic; these are considered together in order to reach a determination, on the balance of probabilities, as to whether or not the item is indeed authentic. This depends on the availability of good databases and a logical development of the case, with corroborative evidence and expert knowledge and opinion. The forensic process can be considered in two parts: investigation of primary sources, and of secondary sources. Primary-source techniques are either invasive or non-invasive. Invasive techniques include laboratory-based analysis of materials, while non-invasive methods such as spectroscopy, X-ray diffraction, electron microscopy, and Raman and Fourier-transform infrared analysis have become important tools in recent decades.

Typically, the first steps are to examine documents regarding the artefact and then to investigate its materials, such as the frame, paints, brushstrokes and finishing techniques. Contaminants such as pollen, dirt and fingerprints can also be useful, as are ageing characteristics and the effects of the environment. Later changes can also be important. Secondary sources are then investigated; these include observations of style and technique, but they are more difficult to deal with as they are subjective and expert opinion is often divided.

Professor Sloggett gave some insight into a number of notorious fraud cases, one being that of Wolfgang Beltracchi, who forged over 200 works that were passed off as pre-World War II works by famous European artists such as Max Ernst, Heinrich Campendonk and Fernand Léger. The amount involved was $48 million over 15 years and resulted in a gaol term of six years, considered to be a rather light sentence. In Europe, art fraud cases are relatively common, but in Australia they are quite rare. One reason is that there is no art fraud squad in Australia: criminal prosecutions are rare because there is no professional expertise in identifying and tracking down criminal art fraud and taking cases to prosecution.

There was an interesting discussion after the talk that continued over the first joint dinner enjoyed by both the Society and the Academy members.


Four Societies Lecture 2014

Thursday, 27 February 2014

"Questions of power in NSW"

Professor Mary O'Kane, NSW Chief Scientist and Engineer

At the annual Four Societies Lecture, Professor Mary O'Kane considered the major questions that face NSW in the future of energy production and utilisation. Asking the right questions is key – it reduces the time taken to identify the best solutions.

Australia is the ninth largest energy producer in the world and one of only four net energy exporters. We have 38% of the world's uranium, 9% of the world's coal and 2% of the world's gas. In terms of consumption, agriculture takes 3%, mining 13.5%, manufacturing and construction 25%, transportation 38% and residential about 11%. The 2014 Commonwealth Energy White Paper seeks to address a number of questions regarding Australia's energy future. These include security of energy sources, the role of government and regulatory implications, growth and investment, trade and international relations, productivity and efficiency, and alternative and emerging energy sources and technologies.

A recent report by the Grattan Institute identified a number of important issues. Australia has a lot of gas and coal, but has yet to fully consider the impact of having no clear climate change policy. There is also the question of how the electrical system (particularly one based on large generation units interconnected by a grid) can meet the challenge of occasional very high peak demand. The Grattan Institute also posed questions around the balance of market and regulation and the importance of getting this right, and explored the implications of new technologies and whether they provide potential solutions.

Australia is not unique in facing these challenges. One approach taken in the US has been to establish an energy agency using a model originally conceived for advanced research projects in the defence industry. ARPA-E (the Advanced Research Projects Agency-Energy) was established to fund high-risk/high-reward research to identify new energy technologies for the US. The research programmes in its portfolio relate to reconceiving the grid, the impact of microgrids, the impact of analysing big data, the gas revolution, new ways to achieve higher efficiencies, entirely new technologies, the best policy settings to encourage the adoption of new technologies, and innovative models for research and development. Perhaps these sorts of approaches need to be utilised in NSW.

Questions that need to be addressed include: what about nuclear energy? To what extent is geothermal energy applicable? How should we gain new efficiencies? How can we better optimise grid storage and geometry? What are the downsides of these various technologies? Are there opportunities to export energy directly to our immediate neighbours (e.g. Indonesia)? How effective is Australia's energy R&D?

Professor O'Kane summarised the issues in three searching questions. First, how do we characterise the system that we want and the process to realise it? (What are the most important characteristics that our energy future must have, or that would be nice to have? What energy futures do we definitely not want?) Second, who should be responsible for demonstrating new technologies (responsible for progress, experiment, scale-up, the economic model and "energy equity")? And third, how can we have the best system possible? We must become expert at asking the questions, seeking solutions from around the world and, importantly, developing solutions locally where appropriate in order to create a leadership position.


1219th Ordinary General Meeting

Wednesday, 5 March 2014

"Big data knowledge discovery: machine learning meets natural science"

Professor Hugh Durrant-Whyte FRS, CEO, National ICT Australia

Hugh Durrant-Whyte is an internationally recognised expert on the analysis of "big data" – the mass of information being generated by current information and communication technologies. Much of this is "metadata" – data captured as part of some activity (for example, the camera settings and capture date recorded when a digital photograph is taken, or the data kept by telecommunication companies every time a mobile phone call is made).

Some 2.5×10¹⁸ bytes of data are generated every day. There is immense value in mining this data, but doing so requires sophisticated analytical techniques. "Data analytics" is the term coined for technologies to analyse this data in areas as varied as the finance industry, the health industry, infrastructure planning, failure analysis in mechanical and electronic equipment and environmental analysis, to name but a few examples. Data analytics utilises Bayesian probability theory (named after the Rev Thomas Bayes, an 18th-century mathematician) to prepare quantitative models of existing data, gather new data to address remaining questions, and then update the model to incorporate both the old and new data.
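The model-update cycle just described can be illustrated with a minimal Bayesian sketch. Here a toy Beta-Bernoulli model (a hypothetical stand-in for the far richer models used in data analytics, not anything specific to NICTA's tools) is updated as each new batch of data arrives, without revisiting the old data:

```python
# Illustrative Bayesian updating with a Beta-Bernoulli model:
# the "model" is a belief about an unknown success rate, and each
# batch of new observations updates the old posterior in place.

def update(alpha, beta, successes, failures):
    """Conjugate update: Beta(a, b) -> Beta(a + successes, b + failures)."""
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Point estimate of the success rate under the current posterior."""
    return alpha / (alpha + beta)

# Start from a uniform prior, Beta(1, 1) ...
a, b = 1.0, 1.0
# ... incorporate an initial data set (70 successes, 30 failures) ...
a, b = update(a, b, 70, 30)
# ... then fold in a later batch without reprocessing the old data.
a, b = update(a, b, 20, 10)
print(round(posterior_mean(a, b), 3))  # prints 0.689
```

The same prior-to-posterior loop is what lets the analysis "converge on knowledge" as more data arrives: each posterior becomes the prior for the next batch.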

Data analytics can be modelled using three types of mathematical functions: discrete functions that describe, for example, events or people's actions; finite probability functions, such as signals or locations; and infinite probability functions, such as spatial or temporal fields. As the mass of available data increases, the analysis can converge on knowledge. For example, payment patterns exhibited by individuals can be aggregated into the behaviour of bank branch customers, giving an understanding of consumer behaviour. On the other side of the table, customers can utilise masses of data to take advantage of the best deals available or to customise internet-based content they may wish to buy.

Where masses of historical data are available (for example, in managing water assets), readily available historical parameters can be analysed for applications such as predicting equipment failures. In the case of water asset management, pipe age, soil type and so on can be analysed to give a probabilistic estimate of when a water main might fail.
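As an illustration of this kind of probabilistic failure analysis, the sketch below uses a Weibull lifetime model with invented shape and scale parameters; in practice these (and the effect of covariates such as soil type and pipe material) would be fitted to the historical failure records mentioned above:

```python
import math

# Hypothetical sketch: probability that a water main fails within the
# next year, given that it has survived to its current age, under a
# Weibull lifetime model. The shape/scale values here are invented
# for illustration, not fitted to real asset data.

def weibull_cdf(t, shape, scale):
    """P(failure before age t) under a Weibull(shape, scale) lifetime."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def prob_fail_next_year(age, shape=2.5, scale=90.0):
    """Conditional probability: P(fail before age+1 | survived to age)."""
    survived = 1.0 - weibull_cdf(age, shape, scale)
    return (weibull_cdf(age + 1, shape, scale) - weibull_cdf(age, shape, scale)) / survived

print(prob_fail_next_year(20))   # a young pipe: low one-year risk
print(prob_fail_next_year(80))   # an old pipe: much higher one-year risk
```

A shape parameter greater than 1 encodes wear-out behaviour: the conditional failure probability rises with age, which is what makes age-based renewal planning possible.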

The mining industry has invested large amounts of money in developing systems to utilise masses of existing information to automate mine operation. This can take all available data around the surface of the mine, the subsurface, mapping, drilling, to create a completely integrated data model into a single, real-time representation of the mine.

The purpose of National ICT Australia (NICTA) is to utilise these data-analytics approaches to produce leading-edge technologies and models for applications as varied as financial modelling and creating large-scale, fully integrated data maps of regions (perhaps even as large as continental Australia). There is also a particular focus on data-driven discovery in the natural sciences, in applications as varied as modelling ancient plate tectonics to predict mineralisation (on a timeframe of as much as 1.5 billion years) or ecological modelling, for example predicting the growth of trees. Ultimately, these may be able to be integrated into one massive model of the Australian continent.


1220th Ordinary General Meeting

Wednesday, 2 April 2014

"The Jameson cell"

Laureate Professor Graeme Jameson AO

At the 1220th ordinary general meeting of the Society, Laureate Professor Graeme Jameson described the development of the Jameson cell, one of the most important technological contributions to the Australian economy in the last 50 years.

The Jameson cell is a flotation cell used to concentrate the valuable components of ore in minerals processing. In a typical mining operation, the first two stages of extracting minerals are the mine itself, from which the ore is recovered, and the concentrator, where the valuable mineral is extracted from the rest. Generally, the valuable components are no more than 2% of the ore recovered, so there is a massive challenge in isolating them from the spoil for further processing. An important technology developed to achieve this concentration step was the flotation cell, a process first developed early in the 20th century.

In flotation, the ore is ground into very fine particles and dispersed with water and surfactants in a large mixing vessel that is kept agitated and into the bottom of which compressed air can be introduced. Reagents are added to make the valuable mineral particles exposed during crushing hydrophobic. Air is bubbled through the suspension; the hydrophobic mineral particles attach to the bubbles, float to the surface as a froth and are then skimmed off for further processing and enrichment. Because large volumes of ore have to be treated to recover a relatively small proportion of valuable product, this is a very expensive step in recovering minerals: first, the ore has to be ground to very fine particle sizes (typically around 150 micrometres), which takes a lot of energy; and second, the volume that has to be treated in preparing the slurry is large, so processing equipment is big and expensive. Any technology that reduces either the cost of grinding or the size of the processing equipment can have a major impact on the cost of production. The Jameson cell revolutionised the flotation process by reducing the size of the equipment needed to float off the minerals efficiently.

Over a period of several years, Professor Jameson identified the optimum parameters for particle size and the corresponding optimum size for the air bubbles used to float the treated particles. Generally, particle size needs to be less than 150 micrometres or, even better, less than 100 micrometres: the smaller the particle, the more likely it is to consist of the pure mineral. But the real technological breakthrough was identifying that the optimum bubble size is about 300 micrometres; until then, conventional cells operated with bubbles about three times that size, at about 1 mm diameter. Having identified the optimum bubble size, the challenge was then to design equipment that produced the right amount of shear to generate bubbles of 300 micrometres diameter. This turned out to be relatively simple, using high-pressure jets of water to entrain the air.

Much of the commercialisation work was done at Mount Isa in the 1980s and 1990s. Since then, the cell has been deployed around the world and is used routinely to extract coal, copper, lead, zinc and potash, as well as in other industries such as oil-sands extraction and industrial waste treatment. Over 300 cells have been installed, and the cumulative value created by this invention is more than $25 billion.

Professor Jameson was named NSW Scientist of the Year in 2013.


1194th General Meeting

"Schizophrenia: from neuropathology to new treatments"

Professor Cyndi Shannon Weickert, Macquarie Group Foundation Chair of Schizophrenia Research, Neuroscience Research Australia and UNSW, and Professor, School of Psychiatry, UNSW

Wednesday 3 August 2011 at 6.30 pm

Seminar Room 102, New Law Building, University of Sydney

Is schizophrenia caused by genes or environment? This question was posed by Professor Cyndi Shannon Weickert at the 1194th ordinary general meeting of the Society. 

Schizophrenia was first formally classified in 1887. Despite extensive pathological investigation, no clear distinction has been identified between the brains of people who have schizophrenia and those who do not. Until the 1930s it was considered to be primarily a behavioural disorder, put down to bad mothering, but in the 1930s treatments involving insulin and shock therapy were shown to be somewhat effective. There was a breakthrough in 1952 when D2R blockers were introduced and found to be effective against some of the symptoms. However, it was not until 1988 that the first definite genetic link was established. Progress since has been swift, and in the last decade it has been shown that there may be several hundred genes involved in the disorder. Because of the large number of genes implicated, identifying treatments that target them is extraordinarily complex. Most researchers in the field now believe that the disease has both environmental and genetic origins.

The approach taken by Professor Shannon Weickert's group is to attempt to identify the pathology of various genetic pathways to the disease, in particular identifying molecules that can be new drug targets. Once these have been postulated, the aim is to use existing drugs which are either known or believed to affect those targets and then to test their effect in clinical trials. This approach has the advantage of using drugs that have already been approved for use in humans, thereby avoiding the necessity for time-consuming and expensive early-stage clinical trials that establish general parameters such as toxicity and dosage levels.

One notable aspect of schizophrenia is that it is virtually never found in children prior to adolescence. Most cases are diagnosed from the mid-teens to the early 20s but, interestingly, there is a second peak among women at menopause. This suggests that sex hormones could be an important part of the mechanism causing the disorder. Oestrogen receptors are found in the human cortex and act as "transcription factors": on binding oestrogen, they carry the hormonal signal into the nucleus of the neuron, where they switch genes on or off. On investigating oestrogen receptor proteins, a mutation specific to schizophrenia has been found in the transcription factor protein ESR1. The mutant protein cannot bind oestrogen and hence cannot pass hormonal signals into the nucleus of the cell; the cell therefore cannot activate important genes that produce their normal proteins, and this may cause some of the symptoms of schizophrenia.

An existing drug, raloxifene, has already been approved as a selective oestrogen receptor modulator for treating various disorders in postmenopausal women. Raloxifene has been found to stimulate the oestrogen receptor and overcome the effect of the ESR1 mutation. However, the great variability of genes means that the drug's effect on one specific mutation is likely to be masked, so clinical trials need to be designed carefully to make the effect apparent. One such trial is currently being conducted by Professor Shannon Weickert's group: a double-blind trial in which patients and control groups are treated in two stages, with all trial participants receiving the drug in one or other of the stages. This clinical trial is still under way and is expected to be completed towards the end of this year. If successful, it may be a major step in establishing personalised drug treatments for the 1% of the human population that currently suffers the debilitating effects of schizophrenia. [The May 2009 edition of the ABC's Australian Story featured Professor Shannon Weickert's work; anyone interested in the clinical trial may wish to view the program or read the transcript.]


1222nd Ordinary General Meeting

Wednesday, 4 June 2014

"What lessons have we learnt from the Global Financial Crisis?"

Professor Robert Marks

In 2008, the world suffered "the equivalent of cardiac arrest", according to the Financial Times. It became virtually impossible for any institution to finance itself (that is, borrow in the markets) for longer than overnight. With the collapse of Lehman Bros, interbank credit markets froze and counterparty risk was considered too great for prospective lenders to take on transactions. The London interbank overnight lending rate, typically in the range of 0.2% to 0.8%, spiked to over 3%. This situation raises two questions: what caused the global financial crisis (GFC), and how can we attempt to avoid similar crises in the future? The origins of the crisis go back more than 30 years.

Starting in 1977, there were substantial changes to US investment legislation. Early in this period, the aim was to make finance more readily available to low-income borrowers, to progressively eliminate controls on mortgage rates, and to remove discrimination in the US housing market. In 1999 and 2000 there was substantial deregulation, with major changes to long-standing legislation, in particular the repeal of the Glass-Steagall Act of 1933, which had imposed restrictions on banks during the Great Depression. There were also reforms to the federal housing finance regulatory agencies, loosening their lending requirements.

This period of financial deregulation encouraged consolidation and demutualisation of many financial institutions that had been mutually or privately owned, with these being floated as public companies. Whereas previously their lending practices had been conservative because they had been risking their own money, now the money at risk belonged to other people! There was also great creativity in developing new financial products and instruments: mortgage-backed securities (MBS), structured investment vehicles, credit default swaps (CDS) and collateralised mortgage obligations (CMO).

In the early 2000s, the September 11 attacks, coming not long after the bursting of the "tech bubble", led to a prolonged period of low interest rates. US fiscal policy was heavily in deficit, leading to massive issuance of US bonds that were largely bought by China and other Asian countries. At the same time there was further financial deregulation, relaxing capital requirements and encouraging higher gearing in financial institutions.

Unsurprisingly, firms responded to the incentives put before them. The market for the new financial instruments boomed, and rating agencies changed the way they charged for their services: they began charging the firms whose products they were rating, rather than the potential buyers of those products. In the US, the financial sector grew from 3.5% of GDP in 1960 to nearly 8% of GDP in 2008.

Drawing these strands together, there were four causes of the GFC: the repeal of the Glass-Steagall Act; the decision by Congress not to regulate derivatives; the relaxation of regulations that allowed banks to expand their gearing; and the change by the ratings agencies to charge the issuer rather than the buyer of rated products.

How likely is this type of situation to occur again in the near future?

Unfortunately, a number of European countries may face similar challenges unless they take steps to avoid the problems the US experienced. Fortunately, Australia avoided the worst of the GFC, well served by its "four pillars" banking policy. However, there needs to be recognition that information is asymmetric and that the issue is really not one of risk but of uncertainty, where there are no simple answers. As George Santayana observed in 1905, "those who cannot remember the past are condemned to repeat it".


1223rd Ordinary General Meeting

"What causes MS? The impact of the genetic revolution"

Professor Graeme Stewart AM

Wednesday, 2 July 2014

Professor Graeme Stewart AM, director of clinical immunology at Westmead Hospital, has researched the genetic influences on disease, in particular multiple sclerosis (MS). MS is the commonest chronic neurological disorder of young adults. It usually starts with a relapsing/remitting phase (symptoms occur and then go into remission for extended periods), commonly with onset at about the age of 30. The disease can be relatively benign with periods of disability; it can present as a relapsing/remitting disease with a gradual increase in disability; or, in about 10-20% of patients, it can present as "primary progressive", where disability increases progressively over time. MS is caused by the body's immune system malfunctioning: macrophages devour the myelin sheath around nerve cells, exposing the nerve axon and thereby disrupting the flow of information along the nerve cell. The body is able to repair the damage by re-myelinating the nerve cells after this initial attack; however, if the myelin is attacked a second time in the same place, the body is unable to repair the sheath and relapse occurs. Hence the symptoms of the disease progress.

The important question is: what causes this? MS is a disease that is clearly influenced by both genes and environment. Studies of the disease in identical twins show 30% concordance, whereas in fraternal twins there is 4% concordance; the background incidence of MS is 0.4% of the population. This suggests that genetic influences are very significant, but environmental factors are also a consideration. One interesting environmental effect is that the incidence of MS is quite highly correlated with latitude: for example, in Australia there is a 4- to 7-fold hazard ratio between North Queensland and Tasmania. In the southern hemisphere, the further south you live, the more likely you are to contract MS. The most likely reasons are reduced exposure to UV-B light the further you are from the equator, and vitamin D deficiency. Other environmental factors include smoking and exposure to the Epstein-Barr virus that causes glandular fever (almost all MS patients have been infected with Epstein-Barr virus). The fact that Epstein-Barr virus is implicated in virtually all MS cases may present an opportunity for treatment if the effect of this virus on DNA is understood.

Since the early 1970s, there has been a search for the genes implicated in MS. The first was found in 1972, but it was not until 2007 that the second gene was identified. Since then, as a consequence of the Human Genome Project and widespread sequencing technology, together with recent advances in computer power and statistical algorithms for handling large amounts of data, over 100 genes have been identified. Pursuing genetic associations is expected to give insight into the pathogenesis of the disease, in particular the interaction between genes and environment. It is hoped that this will lead to interventions to prevent the disease from progressing. In addition, identifying genetic biomarkers may provide major opportunities for new treatments, including personalised treatments based on the individual's genetic profile.

There has been substantial progress in treatments for MS, including trials of drugs to stop T cells crossing the blood-brain barrier, drugs that capture lymphocytes and hold them in the lymph nodes and early indications that drugs targeting specific proteins identified through genetic analysis might be useful. In addition, trials are underway to see whether large doses of vitamin D might have some impact and whether increased exposure to ultraviolet light might also offer some improvement.

Royal Society Events

The Royal Society of NSW organises events in Sydney and in its Branches throughout the year.

In Sydney, these include Ordinary General Meetings (OGMs), normally held at 6.00 for 6.30 pm on the first Wednesday of the month (there is no meeting in January) in the Gallery Room at the State Library of NSW. At the OGMs, Society business is conducted, new Fellows and Members are inducted, and reports from Council are given. This is followed by a public lecture presented by an eminent expert and an optional dinner. Drinks are served before the meeting. There is a small charge to attend the meeting and lecture, and to cover refreshments. The dinner is a separate charge and must be booked in advance. All OGMs are open to members of the public.

Since April 2020, during the COVID-19 pandemic, face-to-face meetings have been replaced by virtual meetings, conducted as Zoom webinars, allowing the events program to continue uninterrupted.  It is hoped that face-to-face meetings can be resumed in late 2020. 

The first OGM of the year, held in February, has speakers drawn from the winners of the Royal Society Scholarships from the previous year, while the December OGM hears from the winner of the Jak Kelly award, before an informal Christmas party. The April or May event is our black-tie Annual Dinner and Distinguished Fellow lecture.

Other events are held in collaboration with other groups, including:

  • The Four Societies lecture — with the Australian Institute of Energy, the Nuclear Panel of Engineers Australia (Sydney Division), and the Australian Nuclear Association
  • The Forum — with the Australian Academy of Science, the Australian Academy of Technology and Engineering, the Australian Academy of the Humanities, and the Academy of the Social Sciences in Australia
  • The Dirac lecture — with UNSW Sydney and the Australian Institute of Physics
  • The Liversidge Medal lecture — with the Royal Australian Chemical Institute

Sydney meetings 

Hunter meetings

Southern Highlands meetings

Details of events scheduled for the remainder of the current year by the Southern Highlands branch can be found on its website.

Details of past events held by the Southern Highlands branch can be found here.
