Can we explain AI experientially? I say yes.
What we talk about when we talk about deciding: Notes from DAAG 2019
I’ll be at VentureBeat’s Transform AI conference July 10-11 in San Francisco. Let me know if you’re attending; would be great to meet. -Tracy It’s not always easy staying on the AI bandwagon. Claims of algorithmic bias abound and (mis)applications threaten people’s trust. Not everyone wants their face recognized or their driver’s license scanned. Developers […]
The skill set for explaining, XAI, and why they both matter.
Since my work is about humans+AI deciding together, I attended DAAG 2019 in beautiful downtown Denver, exploring the “intersection of decision analysis and data science to take decision-making to the next level.” The intent was for decision analysts to better understand data science and “support data-centric decision-making” while data scientists could better “guide the use […]
Machines Gone Wild! + Can Microlearning improve Data Science training?
As data complexity grows, so does the importance of explaining. The philosophy of science can teach us about the role of explaining in high-quality, evidence-based decisions. It’s not just navel-gazing: An explanation is a statement that makes something clear, or a reason or justification given for an action or belief. It describes “a set of […]
1. Machines Gone Wild → Digital trust gap. Last year I spoke with the CEO of a smallish healthcare firm. He had not embraced sophisticated analytics or machine-made decision-making; he had no sense of "what information he could believe." He did, however, trust the CFO's recommendations. Evidently, these sentiments are widely shared. […]