Free Technology eBooks
Computer technology combines the hardware of computers and computer-controlled devices with software—operating systems, authoring tools, expert systems, and courseware—to support learning and training.
The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, “On Computable Numbers.” Turing proposed a simple device that he called the “Universal Computing Machine,” now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, which makes the machine programmable. The fundamental concept of Turing’s design is the stored program: all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines remain a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm-execution capability equivalent to a universal Turing machine.
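To make the stored-program idea concrete, here is a minimal sketch of a Turing machine simulator in Python. The transition-table encoding and the toy “flip every bit” program are illustrative assumptions, not anything from Turing’s paper:

```python
# A minimal Turing machine simulator: the machine's behaviour is determined
# entirely by a transition table (the stored "program"), not by the simulator.
from collections import defaultdict

def run(program, tape, state="start", max_steps=1000):
    """Execute `program`, a dict mapping (state, symbol) -> (next_state, write, move)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # blank cells read as "_"
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        state, write, move = program[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# A toy program: flip every bit, then halt at the first blank cell.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run(flip, "1011"))  # -> 0100_
```

Changing the program dictionary, not the simulator, changes what the machine computes; that is the stored-program principle in miniature.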
1. Adobe Premiere Pro by Ben Goldsmith
Adobe Premiere Pro: A Complete Course and Compendium of Features is your guide to creating, editing, and enhancing videos in Adobe Premiere Pro.
2. AutoDesk AutoCAD for Beginners by Infratech Civil
This AutoCAD book is the guide used with Infratech Civil’s online training course for beginners.
3. Becoming a Salesforce Certified Architect by Tameem Bahri
Design and build high-performance, secure, and scalable Salesforce solutions to meet business demands, and gain practical experience with real-world scenarios by creating engaging end-to-end solution presentations.
4. Beginning Rust Programming by Ric Messier
This is not your typical programming book! Jump right in with interesting, useful programs, some of them drawn from classic computer science problems, used as a way of introducing the language’s programming constructs rather than explaining everything in a dry, theoretical manner that doesn’t translate well to implementation.
5. Complete Guide in AutoCAD 2021 2D and 3D by Mike Meyers
This book contains a detailed explanation of AutoCAD commands and their applications to solve drafting and design problems. Every command is thoroughly explained with the help of examples and illustrations.
6. Computer Science Basic for All by Hong M. Lei
Computer Science is one of the disciplines of modern science, under which we study the various aspects of computer technologies, their development, and their applications in the present world.
7. Deno Web Development by Alexandre Portela dos Santos
8. Effortless App Development with Oracle by Ankur Jain
Build web and mobile applications quickly using the Oracle Visual Builder cloud service, and delve into real-time, end-to-end use cases while exploring best practices and recommendations.
9. Emerging Technologies by Errol S. van Engelen
The convergence of blockchain and the Internet of Things (IoT), powered by data and artificial intelligence (AI), is on the agenda of several big companies, and some of them have already started implementing it in initiatives, solutions, and projects. In this book, the author calls the convergence of these three technologies the blockchain of intelligent things.
10. Hands-On Data Visualization by Jack Dougherty, Ilya Ilyankou
Hands-On Data Visualization takes you step-by-step through tutorials, real-world examples, and online resources.
11. Integration of Cloud Computing with Internet of Things by Monika Mangla
The book integrates aspects of IoT, cloud computing, and data analytics from diversified perspectives. It also discusses recent research trends and advanced topics in the field, which will be of interest to academics and researchers working in this area.
12. Intelligent Human Systems Integration 2021 by Dario Russo
This book presents cutting-edge research on innovative human systems integration and human–machine interaction, with an emphasis on artificial intelligence and automation, as well as computational modeling and simulation.
13. Interpretable Machine Learning with Python by Serg Masis
Learn to build interpretable high-performance models with hands-on real-world examples.
14. Practical Hardware Pentesting by Jean-Georges Valle
Explore embedded systems pentesting by applying the most common attack techniques and patterns.
15. Quantum Computing Fundamentals by Chuck Easttom
Quantum computing is moving from advanced labs to real-world application, and opportunities for qualified quantum computing specialists are growing rapidly.
16. Social Media Marketing All-in-One by Michelle Krasniak
This book shows you how to plan, launch, and manage social media marketing campaigns across today’s major platforms, and how to measure and refine the results.
17. The Essential Python Programming by Lucia Parker
The best thing about Python is that it’s easy to learn and even easier to get up and running. By using tools like Django, for example, you can quickly bring your ideas and creations to life and start monetizing them in no time (see the minimal Django sketch after this list).
18. The Little Book of Java Programming by Huw Collingbourne
This book will teach you the secrets of text adventure programming, from beginner to advanced level. You will learn how to create Rooms and Treasures, how to let the player take and drop objects, and how to save games using serialization (see the save-game sketch after this list).
19. Information-Theoretic Methods in Data Science by Miguel R.D. Rodrigues
Learn about the state-of-the-art at the interface between information theory and data science with this first unified treatment of the subject.
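As a rough illustration of the quick start described for item 17 above, here is a minimal single-file Django app. It is a sketch only: the file name hello.py, the route, and the dev-only settings are illustrative assumptions, not material from the book:

```python
# hello.py -- a minimal single-file Django app (run with: python hello.py runserver)
import sys
from django.conf import settings

settings.configure(
    DEBUG=True,
    SECRET_KEY="dev-only-key",  # placeholder; never hard-code a key in production
    ROOT_URLCONF=__name__,      # routes are defined right here in this file
    ALLOWED_HOSTS=["*"],
)

from django.http import HttpResponse
from django.urls import path

def hello(request):
    # A single view: the root URL is mapped to this function below.
    return HttpResponse("Hello from Django!")

urlpatterns = [path("", hello)]

if __name__ == "__main__":
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)
```

Running python hello.py runserver serves the page at http://127.0.0.1:8000/, which is about as little ceremony as a web framework can ask for.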
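And for the save-game idea in item 18: the book works in Java, where saving is done with Java’s built-in serialization, and the closest Python analogue is pickle. This is a hedged sketch of the same pattern; the GameState class and its fields are hypothetical, not taken from the book:

```python
# Saving and restoring a game with serialization, sketched with Python's pickle.
import pickle

class GameState:
    def __init__(self, room, inventory):
        self.room = room            # name of the room the player is in
        self.inventory = inventory  # objects the player has taken

def save_game(state, path="save.dat"):
    with open(path, "wb") as f:
        pickle.dump(state, f)  # serialize the whole object graph to disk

def load_game(path="save.dat"):
    with open(path, "rb") as f:
        return pickle.load(f)  # reconstruct the saved GameState

save_game(GameState("library", ["lamp", "key"]))
restored = load_game()
print(restored.room, restored.inventory)  # -> library ['lamp', 'key']
```

The appeal in either language is the same: one call writes the entire game state to disk, and one call brings it back.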