How To Quickly Capture Ideas Into OmniFocus

 

Here’s my system for getting ideas into my OmniFocus task management system.

On my Mac, I use this keyboard shortcut:

[Image: OF Quick Entry Shortcut.png]

To get this Quick Entry box:

[Image: OF+Quick+Entry.jpg]

I don’t have to complete all the fields. Rather, I type just enough to remember the idea, hit Return, and my task goes to the OmniFocus inbox. I can later elaborate on it, create a project and next actions, and so on. Similarly, if I want to capture supporting material along with the idea, I use the clipper shortcut:

[Image: OF Send to Inbox.png]

This opens a Quick Entry box along with the highlighted text, URL, or email — the text or link is captured in the Note field. I also use this method for linking back to To-Dos in Basecamp. Very useful!

Away from my Mac, I dictate or type text into Drafts on my iPhone, then send it to OmniFocus as an action. Alternatively, I use Siri to dictate a Reminder, which gets synced with OmniFocus. I process the task when I return to my Mac.
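Alongside Quick Entry, Drafts, and Siri, OmniFocus also supports Mail Drop: email sent to your private Mail Drop address lands in the inbox, with the subject becoming the action name and the body becoming its note. As a rough sketch of scripting that capture step (the address below is a placeholder; the real one comes from your Omni Sync Server account, and actual sending via `smtplib` is left out):

```python
from email.message import EmailMessage

# Placeholder -- your real Mail Drop address comes from your Omni Sync Server account.
MAILDROP_ADDRESS = "example@sync.omnigroup.com"

def make_maildrop_task(task: str, note: str = "") -> EmailMessage:
    """Build an email that OmniFocus Mail Drop turns into an inbox item:
    the subject becomes the action name, the body becomes its note."""
    msg = EmailMessage()
    msg["To"] = MAILDROP_ADDRESS
    msg["Subject"] = task
    msg.set_content(note)
    return msg

# Sending is left to smtplib and your own mail account:
msg = make_maildrop_task("Call opposing counsel", note="Re: deposition scheduling")
print(msg["Subject"])
```

This only builds the message; handing it to your mail server is deliberately omitted, since credentials and SMTP details vary by setup.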

 

No One Has Been Paying Attention For A While Now: What Recent Experiences With Remote Juries Tell Us About Our Distracted World

 
Photo credit: In Court, Everett Collection

Some California courts are holding jury trials during the coronavirus pandemic. Logistics have been challenging. But the biggest problem — one far more consequential than any technical issue and more pervasive than what happens in legal proceedings — is many people’s inability to pay attention anymore.

In one case, a juror left his computer to attend to food on the stove. Another juror could be seen lying in bed. Jurors switched focus away from court proceedings to other screens, kids, pets, and whatever else was happening at home.

In other words, jurors did what everyone does during video meetings. Juror distraction is merely a special case of a general problem.

Here’s what we can do about it.

(Link to full essay, originally published by The Daily Journal, September 11, 2020)

 

How To Change The Legal Profession’s Culture Of Constant Availability


On a recent episode of his podcast, Cal Newport was asked how “deep work” plays out at a law firm.

Based on his discussions with lawyers at different levels in their careers — from new associates to equity partners — Newport believes law firms are “terrible places to work” when it comes to facilitating unbroken concentration and “cognitive hygiene.” As he sees it, this is particularly unfortunate in a field so purely cognitive in its pursuits.

For most lawyers, the fundamental problem is the demand for constant availability — usually through email — a problem I’ve written about before. The frequency of this context switching degrades both the quality and the pace of a lawyer’s work.

Why is this true, and how can we fix the problem?

Photo by Albert Barden. c. 1912, From the Albert Barden Collection, State Archives of North Carolina, Raleigh, NC. Photo: N_53_17_92, NC A&M Dairy Barn. Located on present-day site of Reynolds Coliseum

Disentangling Your Story: Letting Go And Developing A Growth Mindset About Technology


Photo credit: LH_4tography

Jack Kornfield writes about a woman on a meditation retreat in a redwood forest:

She awoke in the middle of the night startled, heart pounding, because she heard a loud growl just outside. She was sure it was a bear close by, perhaps dangerous. Turning on a small flashlight, she looked around and waited fearfully for the unknown growler to make another noise. At first it was quiet. Then, a minute later, her stomach let out a loud growl. She realized that the bean soup from dinner was having its way with her digestive tract! The loud growler was herself.

Kornfield explains the benefits of mindfulness and the practice of noticing when we tell ourselves stories.

Sometimes our stories are useful, allowing us to structure the world and our identity. Many times the stories are objectively false and unhelpful.

In my profession, a common unhelpful story is: “I’m not good at technology.”

If you tell this story, here’s why you should let it go.

Facial Recognition Technology — Mid-2020 Roundup: Keeping The Focus On Social Media Companies


Photo credit: PHOTOCREO Michal Bednarek

Awareness is growing about algorithmic bias and other problems with how law enforcement uses facial recognition technology. Big-name technology companies recently announced self-imposed moratoriums. Congress may prohibit the use of this technology with body cameras and is even considering an outright ban in policing.

Meanwhile, what should we do about the social media companies and other businesses that continue to collect, store, and share our biometric data?

Great Client Service Depends On Clarity And Trust, Rather Than Being Constantly Available


Photo credit: JakubD

For those lawyers working remotely during the pandemic, the interruptions of the office have been replaced by those at home: kids, dogs, and flushing toilets.

Along with these distractions, some of us are experiencing an increased temptation to be “constantly available” for our clients. There are no office hours. Our workstations are laptops and mobile devices. Our workdays theoretically never end.

But good lawyers — whether working from home or elsewhere — do not have to be constantly available to represent their clients well and keep them informed. Here’s why …

Hey: My Review Of Basecamp’s New Email Platform


I’m finishing my two-week trial of Basecamp’s new email platform, Hey.

I’ve used Basecamp with my legal team for several years and it works great for internal communication, project management, and other collaborative aspects of our work. So of course I needed to try Hey.

So how has Hey worked out?

Facial Recognition Software and Social Media

The taboo on facial recognition technology is eroding. And a new app called Clearview is increasingly being used by law enforcement and garnering attention. The New York Times reports:

Police departments have had access to facial recognition tools for almost 20 years, but they have historically been limited to searching government-provided images, such as mug shots and driver’s license photos. In recent years, facial recognition algorithms have improved in accuracy, and companies like Amazon offer products that can create a facial recognition program for any database of images.

Mr. Ton-That wanted to go way beyond that. He began in 2016 by recruiting a couple of engineers. One helped design a program that can automatically collect images of people’s faces from across the internet, such as employment sites, news sites, educational sites, and social networks including Facebook, YouTube, Twitter, Instagram and even Venmo. Representatives of those companies said their policies prohibit such scraping, and Twitter said it explicitly banned use of its data for facial recognition.

Some commentators believe this crosses an ethical line and that we should now ban facial recognition technology altogether.

Of course, the software would never work without the de facto cooperation of social media companies and all of us who freely share our images with them. This particular software apparently works so well because its database of images is far larger than anything the FBI and other law enforcement agencies have.

There’s an irony in law enforcement solving crimes using what is, arguably, the stolen property of people who intended to share an image only with their friends.

Update: Facebook’s own facial recognition technology remains in the courts. Via The Hill:

The Supreme Court on Tuesday declined to take up a high-profile court battle over whether users can sue Facebook for using facial recognition technology on their photos without proper consent.

The high court rejected Facebook's bid to review the case, meaning the social media giant will likely have to face the multibillion-dollar class-action lawsuit over whether it violated an Illinois privacy law.

The case, Facebook vs. Patel, hinges on a question over whether Facebook violated Illinois law when it implemented a photo-tagging feature that recognized users' faces and suggested their names without obtaining adequate consent. Facebook argued to the Supreme Court that the class-action case should not be allowed to proceed because the group of users have not proven that the alleged privacy violation resulted in "real-world harm."

Further Update: The New York Times reports that Facebook agreed to pay $550 million to settle the facial recognition suit.

Smart Ovens Unpredictably Turning On

It used to be that the power would go out and we’d wake up to blinking LED displays. Now, apparently, we need to worry about appliances accidentally turning on during the night.

Ashley Carman, reporting for the Verge:

At least three smart June Ovens have turned on in the middle of the night and heated up to 400 degrees Fahrenheit or higher. The ovens’ owners aren’t sure why this happened, and June tells The Verge that user error is at fault. The company is planning an update that’ll hopefully remedy the situation and prevent it from happening again, but that change isn’t coming until next month.

One owner’s oven turned on around 2:30AM and broiled at 400 degrees Fahrenheit for hours while he slept, and he only noticed when he woke up four hours later. Nest cam footage captured the exact moment it turned on: the oven illuminates his dark, empty kitchen in a truly Black Mirror-like recording. This owner says his wife baked a pie around 11:30PM the night of the preheating incident, but she turned the oven off once she took the pie out.

June compares the “user error” to “butt dialing,” explaining that owners accidentally turned the ovens on with the app. But the company seems to acknowledge that the design of the oven and the app need to be improved to prevent these kinds of mistakes.

The article mentions that “unattended cooking” accounts for thirty-three percent of home fires. Smoking and candles are also up there. My law partner and I once tried a case where the house burned down and killed the elderly gentleman living there. Our forensic analysis showed the stove had been left on. Most of us would probably have to admit we’ve left the stove or oven on a few times.

But accidentally and remotely turning on our appliances — especially ones that get really hot — is definitely a new thing.

(Via Kottke)

State Bar Opinions On Safeguarding Client Data

AI Falsely Accused Thousands Of Fraud?

Civil rights attorney Jennifer L. Lord represents clients in wrongful termination cases. In 2013, she noticed a common story emerging:

“We were experiencing this rush of people calling us and saying they were told they committed fraud,” Lord said. “We spoke to some of our colleagues who also practice civil rights and employment law and everyone was experiencing this.”

Lord and her team then discovered that in 2013 the Michigan Unemployment Insurance Agency had purchased an algorithmic decision-making system called MIDAS — and that, when it did so, the agency also laid off its fraud detection unit.

“The computer was left to operate by itself,” Lord said. “The Michigan auditor general issued two different reports a year later and found the computer was wrong 93 percent of the time.”

The government accused about 40,000 people of fraud and seized close to $100 million by garnishing their wages or seizing tax refunds. This led Lord and her team to file the class action lawsuit Bauserman v. Unemployment Insurance Agency (subscription required), and the Michigan Supreme Court recently ruled the case may proceed.

According to Lord, it took some digging just to learn that the decision was made by an algorithm. Recognizing algorithmic — as opposed to human — decision-making has become an increasingly important skill, one of Yuval Noah Harari’s 21 Lessons for the 21st Century.

This also raises the question of whether machines can and should provide due process. In this case, fraud requires proving intent — a subtle, inferential determination of what people were thinking under the circumstances. Are algorithms currently up to the task? How soon before they are?

It also raises the question of how algorithms should be used to make law enforcement decisions more generally. With facial recognition and surveillance data increasingly available, using algorithms to make decisions becomes more attractive. Just ask Chinese authorities.

Finally, this reminds me of the situation in Baltimore, where a cash-strapped municipality finds itself overwhelmed with technological advancement. For Baltimore, it was the inability to fight back against malware. Here, the state may have been tempted by what seemed like a good technological substitute for an expensive government function.

For now at least, significant government decisions still need to be made by humans. As algorithms are incorporated into government decisions, the process must be disclosed and made as transparent as possible. And critically, there must be a way to appeal the decision to a human.