Trusting the Computer in Computer Vision: A Privacy-Affirming Framework


dc.contributor.author Chen, Andrew en
dc.contributor.author Biglari-Abhari, Morteza en
dc.contributor.author Wang, Kevin I-Kai en
dc.coverage.spatial Honolulu, Hawaii, US en
dc.date.accessioned 2017-09-03T22:45:47Z en
dc.date.issued 2017-08 en
dc.identifier.issn 2160-7508 en
dc.identifier.uri http://hdl.handle.net/2292/35459 en
dc.description.abstract The use of surveillance cameras continues to increase, ranging from conventional applications such as law enforcement to newer scenarios with looser requirements such as gathering business intelligence. Humans still play an integral part in using and interpreting the footage from these systems, but they are also a significant source of unintentional privacy breaches. As computer vision methods continue to improve, we argue in this position paper that system designers should reconsider the role of machines in surveillance and how automation can be used to help protect privacy. We explore this by discussing the impact of the human-in-the-loop, the potential for using abstraction and distributed computing to further privacy goals, and an approach for determining when video footage should be hidden from human users. We propose that in an ideal surveillance scenario, a privacy-affirming framework causes collected camera footage to be processed by computers directly and never shown to humans. This implicitly requires humans to establish trust: to believe that computer vision systems can generate sufficiently accurate results without human supervision, so that when information about people must be gathered, unintentional data collection is mitigated as much as possible. en
dc.description.uri http://vision.soic.indiana.edu/bright-and-dark-workshop-2017/ en
dc.publisher IEEE Xplore en
dc.relation.ispartof The First International Workshop on The Bright and Dark Sides of Computer Vision: Challenges and Opportunities for Privacy and Security (CV-COPS 2017), Computer Vision and Pattern Recognition (CVPR) en
dc.relation.ispartofseries 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) en
dc.rights Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated. Previously published items are made available in accordance with the copyright policy of the publisher. en
dc.rights.uri https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm en
dc.title Trusting the Computer in Computer Vision: A Privacy-Affirming Framework en
dc.type Conference Item en
dc.identifier.doi 10.1109/CVPRW.2017.178 en
pubs.begin-page 1360 en
dc.description.version AM - Accepted Manuscript en
dc.description.version 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE Xplore. 1360-1367. Aug 2017 en
dc.rights.holder Copyright: IEEE Xplore en
pubs.end-page 1367 en
pubs.finish-date 2017-07-21 en
pubs.start-date 2017-07-21 en
dc.rights.accessrights http://purl.org/eprint/accessRights/OpenAccess en
pubs.subtype Proceedings en
pubs.elements-id 629228 en
pubs.org-id Engineering en
pubs.org-id Department of Electrical, Computer and Software Engineering en
dc.identifier.eissn 2160-7516 en
pubs.record-created-at-source-date 2017-06-10 en

