
TECH Leaked Google Memo - with link to video
  1. #1

Leaked Google Memo - with link to video

    I could have sworn I read this on a thread here this morning but I can't find it either on advanced search or by looking at the first two pages (I did this three times, I still might have missed it - if so, mods please combine with the older thread).

But this is so important, I think everyone should watch this approximately 9-minute video, which was supposed to stay only within Google until someone leaked it.

It is very well presented and produced, and if you are not really listening it can seem harmless enough, but when you put it all together the basic message is:

Google can/should be used not only to collect ALL the data for EVERYONE in the world, but that data should be stored forever and used to MANIPULATE all sorts of personal behaviors for some "greater goal or good" (eliminating poverty, for example), doing so by making "suggestions" to "help" modify the personal behavior of individuals and groups to make these changes happen.

Then, accompanied by all sorts of pictures of cute babies and elders, comes the idea that this information and these suggestions (aka social engineering) can then be used to further advance these trends in "the next generation" or "future generations".

    The soft and gentle word "Caretaker" is used to put a name on what Google would supposedly be doing with all this data (how kind and loving - not!).

There was a bit on how "historical information" could be used for suggesting customized products for people to buy, as well as "suggesting" the "best" option at the supermarket (the example they use is pointing out the locally sourced bananas, very funny since the narrator has a British accent and we have so many bananas grown here in Ireland and the UK - NOT!).

As I said, I really tried to find the original thread, and this can be combined with it if I just kept missing it.

Here is the video link, and I will attempt to embed it; see it before it goes down the rathole...
    https://www.youtube.com/watch?v=gFdkooUxI04

expatriate Californian living in rural Ireland with husband, dogs, horses, garden and many, many cats

  2. #2
I have actually just seen this on Drudge; this is the article they linked to:

    https://www.theverge.com/2018/5/17/1...o-data-privacy
    GOOGLE’S SELFISH LEDGER IS AN UNSETTLING VISION OF SILICON VALLEY SOCIAL ENGINEERING
    This internal video from 2016 shows a Google concept for how total data collection could reshape society
By Vlad Savov (@vladsavov), May 17, 2018, 8:00am EDT
    Google has built a multibillion-dollar business out of knowing everything about its users. Now, a video produced within Google and obtained by The Verge offers a stunningly ambitious and unsettling look at how some at the company envision using that information in the future.

    The video was made in late 2016 by Nick Foster, the head of design at X (formerly Google X) and a co-founder of the Near Future Laboratory. The video, shared internally within Google, imagines a future of total data collection, where Google helps nudge users into alignment with their goals, custom-prints personalized devices to collect more data, and even guides the behavior of entire populations to solve global problems like poverty and disease.

    When reached for comment on the video, an X spokesperson provided the following statement to The Verge:

    “We understand if this is disturbing -- it is designed to be. This is a thought-experiment by the Design team from years ago that uses a technique known as ‘speculative design’ to explore uncomfortable ideas and concepts in order to provoke discussion and debate. It’s not related to any current or future products.”

    All the data collected by your devices, the so-called ledger, is presented as a bundle of information that can be passed on to other users for the betterment of society.
    Titled The Selfish Ledger, the 9-minute film starts off with a history of Lamarckian epigenetics, which are broadly concerned with the passing on of traits acquired during an organism’s lifetime. Narrating the video, Foster acknowledges that the theory may have been discredited when it comes to genetics but says it provides a useful metaphor for user data. (The title is an homage to Richard Dawkins’ 1976 book The Selfish Gene.) The way we use our phones creates “a constantly evolving representation of who we are,” which Foster terms a “ledger,” positing that these data profiles could be built up, used to modify behaviors, and transferred from one user to another:

    “User-centered design principles have dominated the world of computing for many decades, but what if we looked at things a little differently? What if the ledger could be given a volition or purpose rather than simply acting as a historical reference? What if we focused on creating a richer ledger by introducing more sources of information? What if we thought of ourselves not as the owners of this information, but as custodians, transient carriers, or caretakers?”

    The so-called ledger of our device use — the data on our “actions, decisions, preferences, movement, and relationships” — is something that could conceivably be passed on to other users much as genetic information is passed on through the generations, Foster says.


    Resolutions by Google, the concept for a system-wide setting that lets users pick a broad goal and then directs their everyday actions toward it.
    Building on the ledger idea, the middle section of the video presents a conceptual Resolutions by Google system, in which Google prompts users to select a life goal and then guides them toward it in every interaction they have with their phone. The examples, which would “reflect Google’s values as an organization,” include urging you to try a more environmentally friendly option when hailing an Uber or directing you to buy locally grown produce from Safeway.


    An example of a Google Resolution superimposing itself atop a grocery store’s shopping app, suggesting a choice that aligns with the user’s expressed goal.
    Of course, the concept is premised on Google having access to a huge amount of user data and decisions. Privacy concerns or potential negative externalities are never mentioned in the video. The ledger’s demand for ever more data might be the most unnerving aspect of the presentation.

    Foster envisions a future where “the notion of a goal-driven ledger becomes more palatable” and “suggestions may be converted not by the user but by the ledger itself.” This is where the Black Mirror undertones come to the fore, with the ledger actively seeking to fill gaps in its knowledge and even selecting data-harvesting products to buy that it thinks may appeal to the user. The example given in the video is a bathroom scale because the ledger doesn’t yet know how much its user weighs. The video then takes a further turn toward anxiety-inducing sci-fi, imagining that the ledger may become so astute as to propose and 3D-print its own designs. Welcome home, Dave, I built you a scale.


    A conceptual cloud processing node that is analyzing user information and determining the absence of a relevant data point; in this case, user weight.
    Foster’s vision of the ledger goes beyond a tool for self-improvement. The system would be able to “plug gaps in its knowledge and refine its model of human behavior” — not just your particular behavior or mine, but that of the entire human species. “By thinking of user data as multigenerational,” explains Foster, “it becomes possible for emerging users to benefit from the preceding generation’s behaviors and decisions.” Foster imagines mining the database of human behavior for patterns, “sequencing” it like the human genome, and making “increasingly accurate predictions about decisions and future behaviours.”

    “As cycles of collection and comparison extend,” concludes Foster, “it may be possible to develop a species-level understanding of complex issues such as depression, health, and poverty.”


    A central tenet of the ledger is the accumulation of as much data as possible, with the hope that at some point, it will yield insights about major global problems.
    Granted, Foster’s job is to lead design at X, Google’s “moonshot factory” with inherently futuristic goals, and the ledger concept borders on science fiction — but it aligns almost perfectly with attitudes expressed in Google’s existing products. Google Photos already presumes to know what you’ll consider life highlights, proposing entire albums on the basis of its AI interpretations. Google Maps and the Google Assistant both make suggestions based on information they have about your usual location and habits. The trend with all of these services has been toward greater inquisitiveness and assertiveness on Google’s part. Even email compositions are being automated in Gmail.

    At a time when the ethics of new technology and AI are entering the broader public discourse, Google continues to be caught unawares by the potential ethical implications and downsides of its products, as seen most recently with its demonstration of the Duplex voice-calling AI at I/O. The outcry over Duplex’s potential to deceive prompted Google to add the promise that its AI will always self-identify as such when calling unsuspecting service workers.

    The Selfish Ledger positions Google as the solver of the world’s most intractable problems, fueled by a distressingly intimate degree of personal information from every user and an ease with guiding the behavior of entire populations. There’s nothing to suggest that this is anything more than a thought exercise inside Google, initiated by an influential executive. But it does provide an illuminating insight into the types of conversations going on within the company that is already the world’s most prolific personal data collector.

    Update: Nick Foster’s title has been updated to include the Near Future Laboratory and X’s response has been moved.

  3. #3
    This is happening now.

Owner has a "flip-phone." No data, no nothing except telephone calls - and a clock.

    Owner's Wife has an Android "Smart Phone." She has it set to get periodic updates on news and events. She'll come into the barn to clean or curry, and at least twice during her visit she will get an "alert."

    I have noticed that when these alerts mention Trump - they invariably show our President in a bad light. No mention EVER of his successes, the North Korean capitulations, the improving business climate, the rise in the stock market, or any of the advantages of "Making America Great Again."

This raises questions in my mind - who exactly decides WHAT merits an "Alert" - and what do they wish to accomplish by providing these alerts?

And why couldn't a different source have a different set of values to determine what constitutes an "alert"?

She has signed onto Their Values, letting them become part of her values.

Of course, when there were only three networks (data streams) of information into the house, alerts were more focused - should one network show an alert, then ALL showed an alert. This was caused by the VAST amount of information possible and the limited time to transfer it.

What has changed since then is your ACCESS to notable news. Every human is now connected to an endless stream of information. Time is no longer the constraint on information passage. And with unlimited access time comes the luxury of decision (for them) regarding what is significant and what is not - and what becomes significant because of its "political significance" and what is ignored, or buried.

    Information can be what liberates you - or the lack thereof can be "what keeps you down on the farm."

    THEY will decide what you need to know.

    YOU, like me, are kept in your own little "Informational Stable" and fed what morsels they will let slip from their table onto yours.

    Welcome to my life!

    Dobbin
    I hinnire propter hoc ecce ego

  4. #4
Way too much data for us mortals to process - it will become (probably already is) food for the AI

  5. #5
    Mommie Facebook, Daddy Google?
    The real art of conversation is not only to say the right thing at the right time, but also to leave unsaid the wrong thing at the tempting moment.

Worrying does not take away tomorrow's troubles, it takes away today's peace.

  6. #6
Google can/should be used not only to collect ALL the data for EVERYONE in the world, but that data should be stored forever and used to MANIPULATE all sorts of personal behaviors for some "greater goal or good" (eliminating poverty, for example), doing so by making "suggestions" to "help" modify the personal behavior of individuals and groups to make these changes happen.
We all "know" that we're subjected to these propagandizing/social engineering efforts. What I found surprising was something I read a year or two back: a report suggesting that even though we are indeed aware of these efforts, that awareness doesn't keep us from succumbing. I'm of the opinion that there's no way to unring the bell, and we're forever stuck with ever greater, more thorough and intrusive efforts to "know us" so that they might manipulate us into whatever performance metric they're trying to bring about.

All we can do is thoroughly educate ourselves and others in critical thinking, debate, reasoning, etc., and hope it's enough pushback.

What is tricky is that society bringing its collective pressure to bear is one of our better methods of socializing people into behaving as law-abiding, productive community members. But who can we trust to decide those goals & values? Not me. Not you. And sure as HELL not GOOGLE!
    We didn't elect Trump for his decorum; We elected him for our survival!

  7. #7
That video was creepy, the music was creepy, and the images were used creepily. The whole thing smelled like a commercial for AI. I had been meaning to post a link to this video for a while; this is the perfect thread to add it to.

    https://www.youtube.com/watch?v=43GuZP5PYeg


DO YOU TRUST THIS COMPUTER
1 hr 18 min
Documentary on AI: its possible noble applications, but also the fears of a possible singularity that could bring the human race to its knees.


NOTICE: Timebomb2000 is an Internet forum for discussion of world events and personal disaster preparation. Membership is by request only. The opinions posted do not necessarily represent those of TB2K Incorporated (the owner of this website), the staff or site host. Responsibility for the content of all posts rests solely with the Member making them. Neither TB2K Inc, the Staff nor the site host shall be liable for any content.
