Radical Speaker Series:
Countering Weaponized Information
4 December 2018
On 4 December, SOFWERX, in collaboration with USSOCOM/J5 Donovan Group, hosted a Radical Speaker Series covering weaponized information. Mass influence operations, deepfakes, and social media metrics have been used by state and non-state actors in attempts to influence everything from public sentiment on policy issues to election results. The type and extent of these influence operations have laid bare policy and technology gaps, representing an emerging threat vector for global competition short of armed conflict.
To view individual presentations, see below.
Part I: Dr. David Perlman
Dr. David M. Perlman studied applied physics at Caltech and electrical engineering at the University of Washington, worked in tech in Seattle during the dot-com boom, and then pursued his interest in cognitive science through a doctorate at the University of Wisconsin-Madison with renowned affective neuroscientist Richard J. Davidson. He began with a focus on the mechanisms by which meditation affects psychological well-being, which took him to India to present to His Holiness the Dalai Lama. Dr. Perlman’s thesis project combined techniques of behavioral economics and psychophysiology to quantitatively study the relationship of narrative and social identity with valuation and decision. His ultimate goal is to implement an “early warning system” to neutralize the dangerous new threat of hostile social media influence as a national security issue.
Dr. Perlman will be presenting on “An Introduction to Cyber-enabled Information Operations.”
We are all in an active global hybrid conflict. The kinetic battlefield is well-known; the “cyber” battlefield is new, and the conflict mode of cyber-enabled information operations (CyIO) is so new that many are only distantly aware of its existence, few understand the critical differences between CyIO and cyberwarfare, and fewer still know how to recognize and describe, let alone counter, CyIO operations. The battlefield of CyIO is “social media,” the internet layer of human-scale information that is emotionally and socially compelling. As a result, CyIO cannot be understood or even discussed without a new synthesis of psychology, sociology, political economics and social choice theory, data science, and enterprise-level systems engineering. Dr. Perlman will draw upon these fields to present an overview of CyIO and how it relates to other domains such as cybersecurity, propaganda, and narrative and information warfare.
Part II: Scot A. Terban
Scot Terban has been in the INFOSEC business since the late ’90s and started his formal career with IBM Security & Privacy in 2000. As one of the founding members of the Security and Privacy team, he performed Red Team exercises for numerous Fortune 100 and 500 companies. After six years of nearly one hundred percent travel for IBM, he moved over to Blue Team activities, finally landing with a financial institution in charge of DFIR (Digital Forensics and Incident Response) and Threat Intelligence for the primary company and 11 subsidiaries. In his off time he writes about national security issues and other security-oriented topics on his blog at krypt3ia.com. Through the blog, Mr. Terban has over the years contributed to several online counterterrorism efforts and worked with authorities, passing them information gathered from jihadist bulletin boards and other corners of the clear and dark webs.
Terban will be presenting on “Your Algorithms Won’t Save You: Why We Need More Sociology and Psychology in The Fight Against Online Propaganda.”
Having spent a great deal of time dealing with online jihadi media, particularly on Twitter, the onslaught of disinformation and PSYOPS from Russia in 2016 was no surprise. The INFOSEC and NATSEC communities, however, were both caught flat-footed on how to respond effectively. In the years when Da’esh leveraged Twitter for propaganda, recruitment, and communications, it became clear that algorithms and other manual means of eradicating them from the platform were lacking. With the IRA (Internet Research Agency), the same kinds of techniques were used by the GRU and SVR, but with a more solid social and psychological foundation. We now face an onslaught of meme warfare and PSYOPS within the same echo chambers, yet the solutions so far have been purely data-based, without addressing the human animals that propagate the messaging. In this presentation I will discuss the single-threaded data-denial model as opposed to a more hybrid approach that could mitigate the messaging to and from its targets through sociological and psychological as well as data models.
Part III: Dr. Matthew Sorell
Dr. Matthew Sorell is a senior lecturer in Telecommunications and Multimedia Engineering at the University of Adelaide and has recently been appointed Adjunct Professor in Digital Forensics at the Tallinn University of Technology (TalTech), Estonia. He is also a consultant to Australian law enforcement and an academic member of the INTERPOL Digital Forensics Experts Group. Dr. Sorell specializes in the analysis of criminal evidence derived from devices such as mobile phones and wearables, and of complex audiovisual content. Dr. Sorell received a high commendation from the U.K. National Police Chiefs Council earlier this year for the development of emerging sources of digital investigation in the International Digital Investigation Awards.
Dr. Sorell will be presenting on “DEEPFAKE: Provenance Countermeasures.”
In response to the emerging and rapidly developing DEEPFAKE video techniques, a number of countermeasures have been proposed. These include the insertion of signatures (such as steganographic watermarking) and the analysis of video and audio content anomalies, for example by machine learning. This presentation will introduce source camera provenance techniques for still and video cameras and propose how these techniques may be applied to both the detection of video manipulation and the tracing of common supply chains.
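The source-camera provenance idea the abstract refers to commonly rests on sensor pattern noise (e.g. PRNU): each camera sensor imprints a weak, device-specific noise pattern, and correlating a test image's denoising residual against a reference fingerprint indicates whether the footage plausibly came from the claimed device. Below is a minimal, illustrative sketch only, not Dr. Sorell's actual method: a crude mean-filter residual and plain normalized correlation stand in for the wavelet denoising and statistical tests used in real forensic work, and `noise_residual` and `correlation` are hypothetical helper names.

```python
import math


def noise_residual(img, k=3):
    """Crude denoising residual: each pixel minus the mean of its
    k x k neighborhood (edges use the available neighbors).
    Real provenance work uses wavelet denoising; this is illustrative."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[ii][jj]
                    for ii in range(max(0, i - r), min(h, i + r + 1))
                    for jj in range(max(0, j - r), min(w, j + r + 1))]
            out[i][j] = img[i][j] - sum(vals) / len(vals)
    return out


def correlation(a, b):
    """Normalized cross-correlation between two equal-sized 2-D grids:
    high values suggest the residual contains the reference fingerprint."""
    fa = [v for row in a for v in row]
    fb = [v for row in b for v in row]
    ma, mb = sum(fa) / len(fa), sum(fb) / len(fb)
    num = sum((x - ma) * (y - mb) for x, y in zip(fa, fb))
    den = math.sqrt(sum((x - ma) ** 2 for x in fa) *
                    sum((y - mb) ** 2 for y in fb))
    return num / den if den else 0.0
```

In practice the reference fingerprint is estimated from many known images of the claimed camera; a markedly low correlation on a frame, or a correlation that drops partway through a video, can then flag manipulation or a break in the supply chain.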
Part IV: Dr. Irene Amerini
Dr. Irene Amerini is a senior postdoctoral researcher at Image Forensics and Security Lab, Media Integration and Communication Center, University of Florence, in Italy. In 2018 she obtained a Visiting Research Fellowship at the School of Computing and Mathematics at Charles Sturt University offered by the Australian Government – Department of Education and Training through the Endeavour Scholarship & Fellowship program. Dr. Amerini received a Ph.D. in computer engineering, multimedia and telecommunication from the University of Florence in 2011. Her research interests are focused on multimedia forensics, particularly in source identification, image and video forgery detection and localization.
Dr. Amerini will be presenting on “Strategies for Countering Fake Information: New Trends in Multimedia Authenticity Verification and Source Identification.”
In this talk, the forensic methodologies and tools employed so far to analyze digital evidence and ensure media authenticity will be outlined, with special attention to the most promising solutions relying on Convolutional Neural Networks. An in-depth focus will be dedicated to realistic scenarios, such as the spread of multimedia data over social networks. Finally, an overview of recent trends and their evolution will be provided, giving insight into new security issues related to deepfakes and computer-generated images produced with GANs.
Part V: Sara-Jayne Terp
Sara-Jayne “SJ” Terp is a data nerd with a long history of working on the hardest data problems she can find. Her background includes designing unmanned vehicle systems and transport, intelligence, and disaster data systems with an emphasis on how humans and autonomous systems work together; developing crowdsourced advocacy tools; managing innovations; teaching data science to Columbia’s international development students; designing probabilistic network algorithms; and working as a pyrotechnician and as CTO of the UN’s big data team. Her current interests focus on misinformation mechanisms and counters, working with groups ranging from journalists and online advertising exchanges (her most recent roles were as a data scientist at an adtech exchange and at a small-business lender covering most of the US) to political data scientists. SJ holds degrees in artificial intelligence and in pattern analysis and neural networks. She founded Bodacea Light Industries to continue this work, and is currently working with the Global Disinformation Index to create an independent disinformation rating system, running a Credibility Coalition working group on applying information security principles to misinformation, contributing to a W3C working group on credibility, and writing a book on automating influence.
Part VI: Keith Dear
Wing Commander Keith Dear is a 16-year RAF intelligence officer, Chief of the Air Staff’s Fellow, DPhil candidate at Oxford University’s Department of Experimental Psychology, and Research Fellow at Oxford’s Changing Character of War Programme. He is co-executive producer of and expert consultant to the Royal United Services Institute’s Artificial Intelligence & the Future Programme. His professional experience is in Intelligence, Surveillance and Reconnaissance (ISR), analysing human behaviours and systems in the UK, with the United States Air Force, and on multiple operations overseas. In 2011 he was awarded King’s College London’s O’Dwyer-Russell prize for the highest MA for his studies in Terrorism and Counter-Terrorism, focused on the efficacy of leadership targeting. He is a founding member and co-lead of the Defence Entrepreneurs’ Forum (UK) and founder and CEO of Airbridge Aviation, a not-for-profit start-up dedicated to delivering humanitarian aid by cargo drone, originally focused on the Syrian sieges. Prior to commencing his current research at Oxford, he was a member of the UK’s cross-Government Counter-ISIL Task Force and headed the targeting cell at the UK’s Permanent Joint Headquarters.
Dear will be presenting on “The Science of Information Warfare.”
In The Science of Information Warfare, Dear will explore the implications, ethical concerns, and opportunities raised by the increasing convergence of the human and life sciences on the idea that humans are no more than biochemical algorithms, reducible to input-output mechanisms. Dear will argue that this understanding, when paired with machine learning algorithms, will enable the modelling and manipulation of human behavior and the exploitation of intelligence at lightning speed, showing how psychology will be at the heart of future operational design.
Submit Feedback: Click Here to Submit Event Feedback
For event related questions, please contact Kinsey Crim, email@example.com