What is RAM?

People always say you should upgrade your RAM.
But what is it, anyways?
Random Access Memory, or RAM (pronounced as a single word, "ram"), is the physical hardware inside a computer that temporarily stores data, serving as the computer's "working" memory.
Additional RAM allows a computer to work with more information at the same time, which usually has a dramatic effect on total system performance.
The purpose of RAM is to provide quick read and write access to data in active use. Your computer loads data into RAM because accessing it there is much quicker than reading that same data directly off a hard drive.
Think of RAM like an office desk. A desk is used for quick access to important documents, writing tools, and other items that you need right now. Without a desk, you'd keep everything stored in drawers and filing cabinets, meaning it would take much longer to do your everyday tasks since you would have to constantly reach into these storage compartments to get what you need, and then spend additional time putting them away.
Similarly, all the data you're actively using on your computer (or smartphone, tablet, etc.) is temporarily stored in RAM. This type of memory, like a desk in the analogy, provides much faster read/write times than using a hard drive. Most hard drives are considerably slower than RAM due to physical limitations like rotation speed.
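To make the desk analogy concrete, here is a minimal Python sketch that times copying 10 MB of data held in memory versus reading the same 10 MB back from a file on disk. All sizes and steps here are illustrative only; real results depend heavily on your hardware, and operating-system caching can make the disk read look deceptively fast.

```python
import os
import tempfile
import time

# Hypothetical micro-benchmark: compare touching 10 MB that lives in RAM
# with reading the same 10 MB back from storage. Numbers vary by machine.
data = os.urandom(10 * 1024 * 1024)  # 10 MB held in RAM

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)                    # put a copy of the data on disk
    path = f.name

start = time.perf_counter()
in_ram = bytes(data)                 # copy straight from memory
ram_seconds = time.perf_counter() - start

start = time.perf_counter()
with open(path, "rb") as f:
    from_disk = f.read()             # read the same bytes back from storage
disk_seconds = time.perf_counter() - start

os.remove(path)
print(f"RAM copy:  {ram_seconds:.4f}s")
print(f"Disk read: {disk_seconds:.4f}s")
```

On most machines the in-memory copy wins comfortably, though a warm OS page cache (itself a use of RAM!) can narrow the gap.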
How Much RAM Do You Need?
Just like with a CPU and hard drive, the amount of memory you need for your computer depends entirely on what you use, or plan to use, your computer for.
For example, if you're buying a computer for heavy gaming, then you'll want enough RAM to support smooth gameplay. Having just 2 GB of RAM available for a game that recommends at least 4 GB will result in very slow performance, if not a total inability to play your games.
On the other end of the spectrum, if you use your computer for light internet browsing and no video streaming, games, memory-intensive applications, etc., you could easily get away with less memory.
The same goes for video editing applications, programs that are heavy on 3D graphics, etc. You can normally find out before you buy a computer just how much RAM a specific program or game will require, often listed in a "system requirements" area of the website or product box.
It would be hard to find a new desktop, laptop, or even tablet that comes with less than 2 to 4 GB of RAM pre-installed. Unless you have a specific purpose for your computer apart from regular video streaming, internet browsing, and normal application use, you probably don't need to buy a computer that has any more RAM than that.
Have you ever checked your blood pressure, and do you know what it is?
Have you ever used the device that measures your blood pressure?
Have you ever wondered how it works? Let's explore!
Our heart is an amazing pump!! WHY IS THAT??? It works reliably for decades, and it safely pumps blood, which is one of the trickiest liquids around. To put it in simple words, our blood vessels are PIPES. They take the output from the pump and distribute it throughout the body.
A blood pressure gauge, THEREFORE, is simply a way to measure the performance of the pump and the pipes.
There are two numbers in a blood pressure reading: systolic and diastolic. For example, a typical reading might be 120/80. When the doctor puts the cuff around your arm and pumps it up, what he/she is doing is cutting off the blood flow with the pressure exerted by the cuff. As the pressure in the cuff is released, blood starts flowing again and the doctor can hear the flow in the stethoscope. The number at which blood starts flowing (120) is the measure of the maximum output pressure of the heart (systolic reading). The doctor continues releasing the pressure on the cuff and listens until there is no sound. That number (80) indicates the pressure in the system when the heart is relaxed (diastolic reading).
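The deflation procedure described above can be sketched as a toy simulation. Everything here is made up for illustration (the "true" pressures, the 180 mmHg starting point, the 1 mmHg step); a real sphygmomanometer detects actual Korotkoff sounds rather than calling a helper function.

```python
# Toy simulation of the auscultatory method described above.
TRUE_SYSTOLIC, TRUE_DIASTOLIC = 120, 80  # made-up "true" pressures (mmHg)

def sound_audible(cuff_pressure):
    # Korotkoff sounds are heard only while the cuff pressure sits between
    # the diastolic and systolic pressures (blood flow is turbulent there).
    return TRUE_DIASTOLIC < cuff_pressure < TRUE_SYSTOLIC

def take_reading(start=180, step=1):
    systolic = diastolic = None
    for p in range(start, 0, -step):  # slowly release the cuff pressure
        if sound_audible(p):
            if systolic is None:
                systolic = p          # first audible sound: systolic
            diastolic = p             # last audible sound so far: diastolic
    return systolic, diastolic

print(take_reading())  # prints (119, 81) with these toy thresholds
```

The simulated reading lands one step inside the true values because sounds are only heard strictly between the two pressures; a finer deflation step would tighten the estimate.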
If the numbers are too high, possible causes include:
The heart is having to work too hard because of restrictions in the pipes.
Certain hormones, like adrenaline (which is released when you are under stress), cause certain blood vessels to constrict, and this raises your blood pressure.
If you are under constant stress, your blood pressure goes up, which means your heart has to work too hard.
Other things that can increase blood pressure include deposits in the pipes and a loss of elasticity as the blood vessels age.
High blood pressure can cause the heart to fail (from working too hard), or it can cause kidney failure (from too much pressure).
Automatic Cuffs: The Tech That Replaced the Manual Gauge Used for Ages!
Automatic blood pressure cuffs run on either electricity or battery power and have a digital screen that displays the blood pressure reading. Automatic cuffs work on the same principle as manual cuffs -- they inflate to cut off blood flow, then slowly release and register the points at which the pulse starts and stops. Once the test is complete, the unit displays both the systolic and diastolic numbers on the screen. Some automatic cuffs also register pulse rate and have a fail-safe that warns if the cuff is not in place. An automatic cuff is much easier to use but may also provide slightly different readings.
Compared with a manual cuff gauge, does this device measure accurately?? Share some of your experiences using one with us! 😼
Information technology (IT) is the application of computers and telecommunications equipment to store, retrieve, transmit and manipulate data, often in the context of a business or other enterprise. Today information technology is used in a wide range of fields, and one of the up-and-coming fields is medical science, where it is known as Health Information Technology (HIT).
Health information technology (HIT) is the application of information processing, involving both computer hardware and software, that deals with the storage, retrieval, sharing, and use of health care information, data, and knowledge for communication and decision making. In HIT, technology represents the computer and communications attributes that can be networked to build systems for moving health information.
LET'S LEARN A LIL BIT OF HISTORY, SHALL WE???
1949 - Gustav Wagner established the first professional organization for health informatics in Germany.
1950s - The rise of computers. Worldwide use of computer technology in medicine began in this era.
Health informatics, also called Health Information Systems, is a discipline at the intersection of information science, computer science, and health care. It deals with the resources, devices, and methods required for optimizing the acquisition, storage, retrieval, and use of information in health and biomedicine.
Health informatics tools include computers, clinical guidelines, formal medical terminologies, and information and communication systems. It is applied to the areas of nursing, clinical care, dentistry, pharmacy, public health, occupational therapy, and (bio)medical research.
The SunDaily Article on Pharmacy Information Management System
There is no argument over the influence of IT in medicine and education. But there are still many areas which need to be improved before we can utilise IT to its full extent. Last but not least, however advanced the technology gets, it can never replace the interaction that doctors and students require with the patient, or the clinical judgment that makes great doctors. So, in the pursuit of modern technologies, we should be careful that doctor-patient relationships do not get overlooked.
A QR Code is a two-dimensional square barcode which can store encoded data.
Most of the time the data is a link to a website (URL).
Introduction
In this digital era, QR Codes are best seen on advertisements, flyers, posters, magazines, and more; you can easily spot these two-dimensional barcodes around you. QR Codes let you interact with the world using your smartphone.
Specifically, a QR Code extends the data available on any physical object and creates a digital extension for marketing operations. This technology enables and speeds up the use of mobile web services: it is a very creative digital tool.
Interactive actions
When scanning a QR Code with a smartphone, you get immediate access to its content. The QR Code reader can then carry out an action, like opening your web browser to a specific URL. Other actions can be triggered, like storing a business card in your smartphone's contact list or connecting to a wireless network.
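As a rough sketch of the reader side, the snippet below shows how an app might pick an action from the scanned text. The URL, WIFI:, vCard, and tel: prefixes are real QR payload conventions, but this dispatcher (and the classify_payload name) is a simplified illustration, not any particular reader's code.

```python
# Simplified sketch: decide which action a QR reader should trigger,
# based on well-known payload prefixes.
def classify_payload(payload: str) -> str:
    if payload.startswith(("http://", "https://")):
        return "open URL in browser"
    if payload.startswith("WIFI:"):
        return "join wireless network"
    if payload.startswith("BEGIN:VCARD"):
        return "save contact card"
    if payload.startswith("tel:"):
        return "dial phone number"
    return "show raw text"          # fallback: just display the content

print(classify_payload("https://example.com"))           # open URL in browser
print(classify_payload("WIFI:T:WPA;S:MyNet;P:secret;"))  # join wireless network
```

A real reader would go on to parse the rest of the payload (e.g. the network name and password after WIFI:), but the prefix check is the gist of how one code can trigger many different actions.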
History
QR Codes were created in 1994 by Denso Wave, a Japanese subsidiary of the Toyota Group. The use of this technology is now free. The QR Code is not the only two-dimensional barcode on the market; another example is the Data Matrix code.
The QR Code is the most famous 2D barcode in the world. It has been a huge success in Japan since the 2000s, where it is now a standard. In 2011, an average of 5 QR Codes were scanned daily by each Japanese person - more than the average number of SMS messages sent!
In 2010 QR Codes started to expand in the USA and then in Europe, where they can notably be seen in advertisements.
“Everything we love about civilization is a product of intelligence, so amplifying our human intelligence with artificial intelligence has the potential of helping civilization flourish like never before – as long as we manage to keep the technology beneficial.”
Max Tegmark, President of the Future of Life Institute
What exactly is AI, put in an easy way to understand???
THE ANSWER?
Commonly, AI or Artificial Intelligence is a machine or a computer program that learns how to do tasks requiring forms of INTELLIGENCE that are usually performed by humans. To look at it another way, intelligence comes in many different aspects, and we have many types of AIs that are good at particular subsets of intelligence.
Below are some examples of what is mentioned:
From AIs like Siri to self-driving cars, artificial intelligence (AI) is progressing rapidly. While science fiction often portrays AI as robots with human-like characteristics, AI can encompass anything from Google’s search algorithms to autonomous weapons.
Artificial intelligence today is properly known as narrow AI (or weak AI), in that it is designed to perform a narrow task (e.g. only facial recognition or only internet searches or only driving a car). However, the long-term goal of many researchers is to create general AI (AGI or strong AI). While narrow AI may outperform humans at whatever its specific task is, like playing chess or solving equations, AGI would outperform humans at nearly every cognitive task.
Artificial Intelligence History
The term artificial intelligence was coined in 1956, but AI has become more popular today thanks to increased data volumes, advanced algorithms, and improvements in computing power and storage.
Early AI research in the 1950s explored topics like problem solving and symbolic methods. In the 1960s, the US Department of Defense took interest in this type of work and began training computers to mimic basic human reasoning. For example, the Defense Advanced Research Projects Agency (DARPA) completed street mapping projects in the 1970s. And DARPA produced intelligent personal assistants in 2003, long before Siri, Alexa or Cortana were household names.
This early work paved the way for the automation and formal reasoning that we see in computers today, including decision support systems and smart search systems that can be designed to complement and augment human abilities.
While Hollywood movies and science fiction novels depict AI as human-like robots that take over the world, the current evolution of AI technologies isn’t that scary – or quite that smart. Instead, AI has evolved to provide many specific benefits in every industry. Keep reading for modern examples of artificial intelligence in health care, retail and more.
1950s–1970s : Neural Networks
Early work with neural networks stirs excitement for “thinking machines.”
1980s–2010s : Machine learning
Machine learning becomes popular.
Present Day : Deep Learning
Deep learning breakthroughs drive AI boom.
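To give a flavor of what those early "thinking machines" actually learned, here is a minimal perceptron, one of the earliest neural-network models, trained in plain Python to compute logical AND. The learning rate and epoch count are arbitrary illustrative choices.

```python
# A minimal single-neuron learner in the spirit of the early neural-network
# era: a perceptron trained on the logical AND function.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # perceptron update rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in AND])  # learns [0, 0, 0, 1]
```

A single perceptron can only learn linearly separable functions like AND; overcoming that limit by stacking many such neurons in layers is exactly what the machine learning and deep learning eras above are about.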
The Advantages of Artificial Intelligence (AI)
Jobs - depending on the level and type of intelligence these machines receive in the future, it will obviously have an effect on the type of work they can do, and how well they can do it (they can become more efficient). As the level of AI increases, so will their competency to deal with difficult, complex, even dangerous tasks that are currently done by humans - a form of applied artificial intelligence.
Increase Our Technological Growth Rate - following on from the point above, AI will potentially help us 'open doors' into new and more advanced technological breakthroughs. For instance, due to their ability to produce millions and millions of computer modelling programs also with high degrees of accuracy, machines could essentially help us to find and understand new chemical elements and compounds etc. Basically, a very realistic advantage AI could propose is to act as a sort of catalyst for further technological & scientific discovery.
They don't stop - as they are machines, there is no need for sleep; they don't get ill; there is no need for breaks or Facebook; they are able to go, go, go! There may obviously be a need for them to be charged or refueled; however, the point is, they are definitely going to get a lot more work done than we can. All that is required is that they have some energy source.
No risk of harm - when we are exploring new undiscovered land or even planets, if a machine gets broken or dies, there is no harm done, as they don't feel and don't have emotions. Whereas sending humans on the same type of expeditions a machine goes on may simply not be possible, or would expose them to high-risk situations.
Act as aids - they can act as 24/7 aids to children with disabilities or the elderly, they could even act as a source for learning and teaching. They could even be part of security alerting you to possible fires that you are in threat of, or fending off crime.
Their function is almost limitless - as the machines will be able to do everything (but just better), their use pretty much doesn't have any boundaries. They will make fewer mistakes, they are emotionless, and they are more efficient - basically giving us more free time to do as we please.
The Disadvantages of Artificial Intelligence (AI)
Over-reliance on AI - as you may have seen in many films such as The Matrix, I, Robot, or even kids' films such as WALL-E, if we rely on machines to do almost everything for us, we become so dependent that if they were to simply shut down (or even decide they want to give up this working gig), they have the potential to ruin our economy and effectively our lives. Although the films are essentially just fiction, they still present a real possibility if we become too heavily dependent on machines. It wouldn't be too smart on our part not to have some sort of backup plan for potential issues that could arise if the machines 'got real smart'.
Human feel - as they are machines, they obviously can't provide you with that 'human touch and quality': the feeling of togetherness and emotional understanding. Machines lack the ability to sympathise and empathise with your situations, and may act irrationally as a consequence.
Inferior - as machines will be able to perform almost every task better than us in practically all respects, they will take up many of our jobs, which will then result in masses of people who are jobless and, as a result, feel essentially useless. This could then lead to issues of mental illness and obesity, etc.
Misuse - there is no doubt that this level of technology in the wrong hands can cause mass destruction: robot armies could be formed, or they could perhaps malfunction or be corrupted, and then we could be facing a scene similar to that of Terminator (hey, you never know).
Ethically wrong? - People say that the gift of intuition and intelligence was God's gift to mankind, and so to replicate that would be, in a way, to 'play God'. Therefore it may not be right to even attempt to clone our intelligence.
HOW CAN AI BE DANGEROUS?
Most researchers agree that a superintelligent AI is unlikely to exhibit human emotions like love or hate. When considering how AI might become a risk, experts think these two scenarios are most likely:
The AI is programmed to do something devastating: Autonomous weapons are artificial intelligence systems that are programmed to kill. In the hands of the wrong person, these weapons could easily cause mass casualties. Moreover, an AI arms race could lead to an AI war that also results in mass casualties. To avoid being misused by the enemy, these weapons would be designed to be extremely difficult to simply “turn off,” so humans could plausibly lose control of such a situation. This risk is one that’s present even with narrow AI, but grows as levels of AI intelligence and autonomy increase.
The AI is programmed to do something beneficial, but it develops a destructive method for achieving its goal: This can happen whenever we fail to fully align the AI’s goals with ours, which is strikingly difficult. If you ask an obedient intelligent car to take you to the airport as fast as possible, it might get you there chased by helicopters and covered in vomit, doing not what you wanted but literally what you asked for. If a superintelligent system is tasked with an ambitious geoengineering project, it might wreak havoc with our ecosystem as a side effect, and view human attempts to stop it as a threat to be met.
Have you guys ever thought about this before, or only while reading this post?
Let Us Explore together the greatness of technology advancing rapidly nowadays!!
Facial recognition is certainly not new tech. In this post, I will give an overview of the most famous examples of how it's currently used.
The year 2018 led the mobile industry to follow new trends. For example, we have seen bezels on smartphones shrink and almost disappear, and witnessed the increasingly advanced use of artificial intelligence along with upgraded chipsets.
But how do these facial recognition technologies work? There are different variations and today we’ll see how they work according to their different characteristics.
LG Face Recognition
LG uses one of the simplest and most effective forms of face recognition. The dedicated software records an image of the user through the front camera. To acquire biometric data, it asks the person to nod and rotate their head LEFT and RIGHT so it can capture as much data as possible.
1) Just pull your smartphone out of your pocket or lift it up from the table and place it in front of your face to unlock the device. 2) You don’t have to press the release button, and no interaction with the display is required. 3) You can hide the content of your notifications until the device recognizes your face, but you can’t watch notifications come in from the lock screen. This feature is available on the LG G6, Q6 and V30 smartphones.
Honor Face Unlock
Honor, Huawei’s sister company in China, introduced face recognition for the first time on the Honor View 10, and then recently brought it to the Honor 7X through an update.
At first the technology was only used to show the contents of notifications on the lock screen (that were otherwise hidden), but didn’t actually unlock the smartphone. A PIN or fingerprint scan was still needed to unlock the device. NOTE: Huawei aims to beat Apple with its own face recognition tech.
1) With the latest software update, the recognition technology has finally become truly useful.
2) After recognizing the smartphone’s owner, the technology allows you to unlock the device directly or through a swipe, without a PIN or password.
3) You can choose to turn on the display and start recognition automatically as soon as you lift up your smartphone and point it towards your face, similarly to what you do for LG or Apple.
OnePlus Face Unlock
OnePlus introduced unlocking via facial recognition on the OnePlus 5T and then made it available on its predecessor models, the OnePlus 5 and 3/3T. OnePlus’s procedure is also simple, as the appearance of your face is detected via the front camera and the scan is recorded offline on your smartphone.
Each time you unlock your smartphone, it uses the front camera for recognition, comparing the original scan to what the camera sees.
1) Unlocking usually occurs immediately, quickly and accurately, when the power button is pressed (or through a double tap on the off display), but you can change the settings so that after detecting the user’s face it will require a swipe on the screen to unlock.
2) This is useful if you just want to read notifications without unlocking the device.
3) There is also a function that illuminates the screen to allow for better facial recognition in low lighting conditions.
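All of these systems share the same matching step: compare a stored scan against what the camera currently sees. The sketch below illustrates that idea with cosine similarity over made-up feature vectors; real phones extract features with trained models and use proprietary thresholds, so every number and name here is hypothetical.

```python
import math

# Hypothetical sketch of the "compare the original scan to what the camera
# sees" step. The feature vectors are invented numbers for illustration.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def unlock(enrolled, scan, threshold=0.95):
    # Unlock only if the live scan is close enough to the enrolled template.
    return cosine_similarity(enrolled, scan) >= threshold

enrolled = [0.9, 0.1, 0.4, 0.7]          # stored at setup time
same_person = [0.88, 0.12, 0.41, 0.69]   # slightly different lighting/angle
stranger = [0.1, 0.9, 0.7, 0.2]

print(unlock(enrolled, same_person))  # True
print(unlock(enrolled, stranger))     # False
```

The threshold is the key design trade-off: set it too low and strangers (or photos) get in; set it too high and the owner gets rejected after a haircut or in bad lighting.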
Samsung Face Recognition (and iris scanner)
Samsung uses facial recognition technology that is very simple and similar to stock Android software.
1) During setup, your smartphone will ask you to be at a distance of about 20 to 50 cm from your device so that it can scan your face.
2) Unlocking is fast enough, but requires that you press the unlock key to turn on the display and start the recognition procedure.
As an alternative to this insecure unlocking method (a photo or video can deceive the system), your smartphone also allows you to use the iris scanner.
Samsung is the only manufacturer that currently uses this technology on its smartphones, but I personally find it slow and unreliable, despite the fact that, according to the company, it’s much more reliable than reading your fingerprint. It’s also really difficult to position your device at the right distance and angle for the scanner to work. The function is available on the Galaxy S8, S8+ and Note 8.
Apple Face ID
The iPhone X is the only smartphone from the Californian company that has this feature. The smartphone, designed to celebrate the tenth anniversary of the device that has changed the world, features an array of sensors located in the upper part of the display inside the so-called notch.
1) The important sensors required for facial recognition are the IR dot projector and the infrared camera.
2) Thanks to Apple’s IR emitter, the iPhone X is able to project thousands of infrared rays toward the user, which allow the camera to create a 3D model of your face by measuring the distance traveled by each beam before it hits your face.
3) The scan is really precise and effective and mainly takes into account facial features and the relative position of your eyes and nose. This choice has proven to be successful in everyday use, as the iPhone X can even recognize its owner after a change in hairstyle, facial hair (especially relevant for me) or if you’re wearing a hat, headphones, or a scarf.
Furthermore, it makes this type of facial recognition the safest ever seen on a smartphone thanks to the precise information detected through 3D scanning.
The smartphone’s display can be turned on independently thanks to Raise to Wake, and it recognizes the user’s face in a fraction of a second, even in the dark. The facial recognition only serves to reveal the content of notifications (otherwise hidden) and unlock your smartphone, meaning that the iPhone understands it’s in the hands of its owner. The actual unlocking to reach the home screen is done with a swipe from below.