{"id":6230,"date":"2017-10-31T01:20:32","date_gmt":"2017-10-30T17:20:32","guid":{"rendered":"https:\/\/www.curtin.edu.au\/news\/computers-make-better-bank-managers-humans\/"},"modified":"2022-12-07T13:08:07","modified_gmt":"2022-12-07T05:08:07","slug":"computers-make-better-bank-managers-humans","status":"publish","type":"post","link":"https:\/\/www.curtin.edu.au\/news\/computers-make-better-bank-managers-humans\/","title":{"rendered":"Do computers make better bank managers than humans?"},"content":{"rendered":"<p><em>You may have heard that algorithms will take over the world. But how are they operating right now? We take a look in our series on <a href=\"https:\/\/theconversation.com\/au\/topics\/algorithms-at-work-44799\">Algorithms at Work<\/a>.<\/em><\/p>\n<p>Algorithms are increasingly making decisions that affect ordinary people\u2019s lives. One example of this is so-called \u201calgorithmic lending\u201d, with some companies <a href=\"http:\/\/www.afr.com\/business\/banking-and-finance\/big-four-banks-do-deals-to-fight-mortgage-disrupters-20171011-gyz7wa\">claiming<\/a> to have reduced the time it takes to approve a home loan to mere minutes.<\/p>\n<p>But can computers become better judges of financial risk than human bank tellers? Some computer scientists and data analysts <a href=\"http:\/\/www.afr.com\/business\/banking-and-finance\/financial-services\/credit-rating-agency-questions-the-rise-of-lending-algorithms-20161122-gsuwjy\">certainly think so<\/a>.<\/p>\n<h2>How banking is changing<\/h2>\n<p>On the face of it, bank lending is rather simple.<\/p>\n<p>People with excess money deposit it in a bank, expecting to earn interest. People who need cash borrow funds from the bank, promising to pay the amount borrowed plus interest. The bank makes money by charging a higher interest rate to the borrower than it pays the depositor.<\/p>\n<p>Where it gets a bit trickier is in managing risk. 
If the borrower defaults on payments, not only does the bank not earn the interest income, it also loses the amount loaned (unless collateral, such as a house or car, was attached).<\/p>\n<p>A borrower who is deemed less creditworthy is charged a higher interest rate, thereby compensating the bank for the additional risk.<\/p>\n<hr \/>\n<p><em><strong>Read more:<br \/>\n<a href=\"http:\/\/theconversation.com\/how-marketers-use-algorithms-to-try-to-read-your-mind-84682\">How marketers use algorithms to (try to) read your mind<\/a><\/strong><br \/>\n<\/em><\/p>\n<hr \/>\n<p>Consequently, banks face a delicate balancing act &#8211; they always want more borrowers to increase their income, but they need to screen out those who aren\u2019t creditworthy.<\/p>\n<p>Traditionally, this role was fulfilled by an experienced credit manager \u2014 a judge of human character \u2014 who could distinguish between responsible borrowers and those unlikely to meet their repayment schedules.<\/p>\n<h2>Are humans any good at judging risk?<\/h2>\n<p>When you look at the research, it doesn\u2019t seem that humans are that great at judging financial risk.<\/p>\n<p>Two psychologists <a href=\"http:\/\/journals.sagepub.com\/doi\/abs\/10.1518\/155534307X232857\">conducted<\/a> an experimental study to assess the kind of information that loan officers rely upon. They found that in addition to \u201chard\u201d financial data, loan officers rely on \u201csoft\u201d gut instincts. 
The latter was even regarded as a more valid indicator of creditworthiness than financial data.<\/p>\n<p><a href=\"http:\/\/amj.aom.org\/content\/40\/5\/1063.short\">Additional studies<\/a> of loan officers in controlled experiments showed that the longer the bank\u2019s association with the customer, the larger the requested loan, and the more exciting the customer\u2019s industry, the more likely loan officers are to underrate loan risks.<\/p>\n<p><a href=\"https:\/\/www.researchgate.net\/publication\/227445655_The_Effects_of_Task_Size_and_Similarity_on_the_Decision_Behavior_of_Bank_Loan_Officers\">Other researchers<\/a> have found that the more applications loan officers have to process, the greater the likelihood that they will use non-compensatory (irrational) decision strategies \u2014 for example, approving a customer on the strength of a high income alone, even though a high income does not rule out a bad credit history.<\/p>\n<p>Loan officers have also <a href=\"https:\/\/deepblue.lib.umich.edu\/handle\/2027.42\/28159\">been found<\/a> to reach decisions early in the lending process, tending to ignore information that is inconsistent with their early impressions. Lastly, loan officers <a href=\"https:\/\/www.researchgate.net\/publication\/247874304_The_Effect_of_Auditor_Attestation_and_Tolerance_for_Ambiguity_on_Commercial_Lending_Decisions\">often fail<\/a> to properly weigh the credibility of financial information when evaluating commercial loans.<\/p>\n<h2>Enter algorithmic lending<\/h2>\n<p>Compared with human bank managers, a computer algorithm is like a devoted apprentice who painstakingly observes each person\u2019s credit history over many years.<\/p>\n<p>Banks already have troves of data on historical loan applications paired with outcomes &#8211; whether the loan was repaid or defaulted. 
Armed with this information, an algorithm can screen each new credit application to determine the applicant\u2019s creditworthiness.<\/p>\n<p>There are various methods by which the algorithm can identify the most relevant and distinctive attributes in each applicant\u2019s profile.<\/p>\n<hr \/>\n<p><em><strong>Read more:<br \/>\n<a href=\"http:\/\/theconversation.com\/algorithms-might-be-everywhere-but-like-us-theyre-deeply-flawed-66838\">Algorithms might be everywhere, but like us, they&#8217;re deeply flawed<\/a><\/strong><br \/>\n<\/em><\/p>\n<hr \/>\n<p>For example, if the application is filled in by hand and scanned into the computer, the algorithm may consider whether the application was written in block capitals or in cursive handwriting.<\/p>\n<p>The algorithm may have detected a pattern that applicants who write in all-caps without punctuation are usually less educated, with lower earning potential, and therefore riskier. Who knew that how you write your name and address could result in <a href=\"https:\/\/www.bloomberg.com\/research\/stocks\/private\/snapshot.asp?privcapid=253231307\">denial of a credit application<\/a>?<\/p>\n<p>On the other hand, a degree from Harvard University could be viewed <a href=\"https:\/\/www.americanbanker.com\/news\/is-it-ok-for-lending-algorithms-to-favor-ivy-league-schools\">favourably<\/a> by algorithms.<\/p>\n<h2>On balance, computers come out ahead<\/h2>\n<p>A large part of human decision-making is based on the first few seconds of an encounter and how much the assessor likes the applicant. A well-dressed, well-groomed young individual has more chance of obtaining a loan from a human credit checker than an unshaven, dishevelled bloke. 
But an algorithm is unlikely to make the same kind of judgement.<\/p>\n<p><a href=\"http:\/\/scholarworks.law.ubalt.edu\/cgi\/viewcontent.cgi?article=1307&amp;context=all_fac\">Some critics<\/a> contend that algorithmic lending will shut disadvantaged people out of the financial system, because of its use of pattern-matching and financial histories. They assume that, because machines are by definition neutral, the usual banking rules will not apply to them. This is a misconception.<\/p>\n<p>The computer program is constrained by the same regulations as the human underwriter. For example, the computer program cannot deny applications from a particular postal code, as postal codes are usually segregated by income level and ethnicity.<\/p>\n<p>Moreover, such overt or covert discrimination can be prevented by requiring lending agencies (and their algorithms) to provide reasons why a particular application was denied, as <a href=\"https:\/\/www.oaic.gov.au\/privacy\/the-privacy-act\/credit-reporting\/\">Australia has done<\/a>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/counter.theconversation.com\/content\/85086\/count.gif?distributor=republish-lightbox-basic\" alt=\"The Conversation\" width=\"1\" height=\"1\" \/>In conclusion, computers make lending decisions based on objective data and avoid the biases exhibited by people, while complying with the regulations that govern fair lending practices.<\/p>\n<p>This article was originally published on <a href=\"http:\/\/theconversation.com\">The Conversation<\/a>. Read the <a href=\"https:\/\/theconversation.com\/do-computers-make-better-bank-managers-than-humans-85086\">original article<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Can computers become better judges of financial risk than human bank tellers? 
Some computer scientists and data analysts certainly think so.<\/p>\n","protected":false},"author":4275,"featured_media":6231,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_oasis_is_in_workflow":0,"_oasis_original":0,"_oasis_task_priority":"","_relevanssi_hide_post":"","_relevanssi_hide_content":"","_relevanssi_pin_for_all":"","_relevanssi_pin_keywords":"","_relevanssi_unpin_keywords":"","_relevanssi_related_keywords":"","_relevanssi_related_include_ids":"","_relevanssi_related_exclude_ids":"","_relevanssi_related_no_append":"","_relevanssi_related_not_related":"","_relevanssi_related_posts":"","_relevanssi_noindex_reason":"","wds_primary_category":0,"wds_primary_research-areas":0,"footnotes":""},"categories":[4],"tags":[],"research-areas":[],"class_list":["post-6230","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-research"],"acf":{"post_options":{"":null,"additional_content":{"title":"","content":"","image":false},"related_courses":false,"credits":{"author":{"title":"Saurav Dutta","url":"#","target":""},"photographer":"","media":false},"display_author":true,"banner":{"image":false}}},"featured_image":"https:\/\/www.curtin.edu.au\/news\/wp-content\/uploads\/2022\/07\/shutterstock_283575653-2.jpg","author_meta":{"first_name":"Curtin","last_name":"University","display_name":"Curtin University"},"publishpress_future_action":{"enabled":false,"date":"2026-04-14 
14:09:45","action":"change-status","newStatus":"draft","terms":[],"taxonomy":"category","extraData":[]},"publishpress_future_workflow_manual_trigger":{"enabledWorkflows":[]},"_links":{"self":[{"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/posts\/6230","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/users\/4275"}],"replies":[{"embeddable":true,"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/comments?post=6230"}],"version-history":[{"count":0,"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/posts\/6230\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/media\/6231"}],"wp:attachment":[{"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/media?parent=6230"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/categories?post=6230"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/tags?post=6230"},{"taxonomy":"research-areas","embeddable":true,"href":"https:\/\/www.curtin.edu.au\/news\/wp-json\/wp\/v2\/research-areas?post=6230"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}