The CompTIA 220-901 exam is one of two exams required to obtain the CompTIA A+ certification (900 series). This exam covers topics such as networking, mobile devices, hardware, and network troubleshooting.
According to Apple, a Lightning cable can be up to 2 meters (6.5 feet) long and still work optimally. You can likely find longer third-party (non-Apple) cables, but Apple only recommends and sells cables up to 2 meters.
Lightning is a proprietary computer bus and power connector created and designed by Apple Inc. and introduced on September 12, 2012 (2012-09-12), to replace its predecessor, the 30-pin dock connector. The Lightning connector is used to connect Apple mobile devices like iPhones, iPads, and iPods to host computers, external monitors, cameras, USB battery chargers, and other peripherals. Using 8 pins instead of 30, Lightning is much smaller than its predecessor, which was integrated with devices like the iPhone 4 and the iPad 2. The Lightning plug is symmetrical (same pins on either side), so it can be inserted into a Lightning receptacle in either orientation. The plug is indented on each side to match up with corresponding points inside the receptacle to retain the connection.
Lightning_(connector) - Wikipedia, the free encyclopedia
The Basic Input/Output System's (BIOS) main function when a computer is powered on is to perform a Power-On Self-Test (POST), which verifies that the required computer components are installed and functional.
In computing, BIOS (Basic Input/Output System, also known as the System BIOS, ROM BIOS, BIOS ROM or PC BIOS) is firmware used to provide runtime services for operating systems and programs and to perform hardware initialization during the booting process (power-on startup). The BIOS firmware comes pre-installed on an IBM PC or IBM PC compatible's system board and exists in some UEFI-based systems to maintain compatibility with operating systems that do not support UEFI native operation. The name originates from the Basic Input/Output System used in the CP/M operating system in 1975. The BIOS originally proprietary to the IBM PC has been reverse engineered by some companies (such as Phoenix Technologies) looking to create compatible systems. The interface of that original system serves as a de facto standard. The BIOS in modern PCs initializes and tests the system hardware components (Power-on self-test), and loads a boot loader from a mass storage device which then initializes a kernel. In the era of DOS, the BIOS provided BIOS interrupt calls for the keyboard, display, storage, and other input/output (I/O) devices that standardized an interface to application programs and the operating system. More recent operating systems do not use the BIOS interrupt calls after startup. Most BIOS implementations are specifically designed to work with a particular computer or motherboard model, by interfacing with various devices, especially the system chipset. Originally, BIOS firmware was stored in a ROM chip on the PC motherboard. In later computer systems, the BIOS contents are stored on flash memory so they can be rewritten without removing the chip from the motherboard.
BIOS - Wikipedia, the free encyclopedia
Apple's Thunderbolt 3 protocol uses the USB-C connector type. Thunderbolt 3 is also compatible with USB-C, so a Thunderbolt 3 port will work with both Thunderbolt 3 and USB-C connectors. Similarly, Thunderbolt versions 1 and 2 use the Mini DisplayPort connector.
USB-C (formally known as USB Type-C) is a 24-pin USB connector system with a rotationally symmetrical connector. The USB Type-C Specification 1.0 was published by the USB Implementers Forum (USB-IF) and was finalized in August 2014. It was developed at roughly the same time as the USB 3.1 specification. In July 2016, it was adopted by the IEC as "IEC 62680-1-3". A device with a Type-C connector does not necessarily implement USB, USB Power Delivery, or any Alternate Mode: the Type-C connector is common to several technologies while mandating only a few of them. USB 3.2, released in September 2017, replaces the USB 3.1 specification.
USB_Type-C - Wikipedia, the free encyclopedia
The FireWire (IEEE 1394) copper cable can extend up to 4.5 meters (15 feet); the interface was replaced by Thunderbolt in newer Macintosh models.
IEEE 1394 is an interface standard for a serial bus for high-speed communications and isochronous real-time data transfer. It was developed in the late 1980s and early 1990s by Apple in cooperation with a number of companies, primarily Sony and Panasonic. Apple called the interface FireWire. It is also known by the brand names i.LINK (Sony), and Lynx (Texas Instruments). The copper cable used in its most common implementation can be up to 4.5 metres (15 ft) long. Power and data is carried over this cable, allowing devices with moderate power requirements to operate without a separate power supply. FireWire is also available in Cat 5 and optical fiber versions. The 1394 interface is comparable to USB. USB was developed subsequently and gained much greater market share. USB requires a host controller whereas IEEE 1394 is cooperatively managed by the connected devices.
IEEE_1394 - Wikipedia, the free encyclopedia
Which Digital Visual Interface (DVI) connector type does not provide support for digital signal transmission?
DVI-A is the connector type that uses analog only, and does not support digital signal transmission.
Digital Visual Interface (DVI) is a video display interface developed by the Digital Display Working Group (DDWG). The digital interface is used to connect a video source, such as a video display controller, to a display device, such as a computer monitor. It was developed with the intention of creating an industry standard for the transfer of uncompressed digital video content. Featuring support for analog connections, DVI devices manufactured as DVI-I are compatible with the analog VGA interface by including VGA pins, while DVI-D devices are digital-only. This compatibility, along with other advantages, led to its widespread acceptance over competing digital display standards Plug and Display (P&D) and Digital Flat Panel (DFP). Although DVI is predominantly associated with computers, it is sometimes used in other consumer electronics such as television sets and DVD players.
Digital_Visual_Interface - Wikipedia, the free encyclopedia
Lightning is the connector for Apple's line of handheld devices. Thunderbolt is the connector type used for Apple laptops and notebooks.
Lightning_(connector) - Wikipedia, the free encyclopedia
The front-side bus (FSB) runs between the CPU and the northbridge, while the back-side bus connects the CPU to the cache.
The front-side bus (FSB) is a computer communication interface (bus) that was often used in Intel-chip-based computers during the 1990s and 2000s. The EV6 bus served the same function for competing AMD CPUs. Both typically carry data between the central processing unit (CPU) and a memory controller hub, known as the northbridge. Depending on the implementation, some computers may also have a back-side bus that connects the CPU to the cache. This bus and the cache connected to it are faster than accessing the system memory (or RAM) via the front-side bus. The speed of the front side bus is often used as an important measure of the performance of a computer. The original front-side bus architecture has been replaced by HyperTransport, Intel QuickPath Interconnect or Direct Media Interface in modern volume CPUs.
Front-side_bus - Wikipedia, the free encyclopedia
A type of Central Processing Unit (CPU) architecture where a single physical CPU contains more than one execution core on a single die or chip is known as:
This architecture is known as multi-core. Having a multi-core CPU allows a computer to execute multiple instructions at one time.
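To see multiple cores in action, here is a minimal Python sketch (the square worker function is a hypothetical stand-in for real work) that spreads tasks across a process pool so the operating system can schedule each worker on a separate core:

```python
import multiprocessing as mp

def square(n):
    # A stand-in for real work; each call can run on a different core.
    return n * n

if __name__ == "__main__":
    # Pool() defaults to os.cpu_count() worker processes, one per logical CPU.
    with mp.Pool() as pool:
        print(pool.map(square, range(8)))  # -> [0, 1, 4, 9, 16, 25, 36, 49]
```

Processes rather than threads are used here because CPython's global interpreter lock prevents pure-Python threads from executing bytecode on multiple cores at the same time.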
A multi-core processor is a microprocessor on a single integrated circuit with two or more separate processing units, called cores, each of which reads and executes program instructions. The instructions are ordinary CPU instructions (such as add, move data, and branch) but the single processor can run instructions on separate cores at the same time, increasing overall speed for programs that support multithreading or other parallel computing techniques. Manufacturers typically integrate the cores onto a single integrated circuit die (known as a chip multiprocessor or CMP) or onto multiple dies in a single chip package. The microprocessors currently used in almost all personal computers are multi-core. A multi-core processor implements multiprocessing in a single physical package. Designers may couple cores in a multi-core device tightly or loosely. For example, cores may or may not share caches, and they may implement message passing or shared-memory inter-core communication methods. Common network topologies used to interconnect cores include bus, ring, two-dimensional mesh, and crossbar. Homogeneous multi-core systems include only identical cores; heterogeneous multi-core systems have cores that are not identical (e.g. big.LITTLE have heterogeneous cores that share the same instruction set, while AMD Accelerated Processing Units have cores that do not share the same instruction set). Just as with single-processor systems, cores in multi-core systems may implement architectures such as VLIW, superscalar, vector, or multithreading. Multi-core processors are widely used across many application domains, including general-purpose, embedded, network, digital signal processing (DSP), and graphics (GPU).
Multi-core_processor - Wikipedia, the free encyclopedia
The No-eXecute (NX) bit allows the operating system to mark certain areas of memory as "non-executable", which can protect the system against different forms of malware.
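As an illustration of the idea, here is a small Python sketch (Unix-only; mmap's prot argument is not available on Windows) that maps one page of memory without execute permission, which is the property the NX bit enforces in hardware:

```python
import mmap

# Request one anonymous page that is readable and writable but NOT
# executable; the OS sets the NX bit for it in the page tables.
page = mmap.mmap(-1, mmap.PAGESIZE, prot=mmap.PROT_READ | mmap.PROT_WRITE)
page.write(b"\x90" * 16)  # bytes that look like x86 NOP instructions...
# ...but the CPU will refuse to execute them: jumping into this page
# raises a fault instead of running injected code.
page.close()
```

A code page would instead be mapped readable and executable but not writable, the "Write XOR Execute" policy described in the excerpt below.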
The NX bit (no-execute) is a technology used in CPUs to segregate areas of a virtual address space for use by either storage of processor instructions or for storage of data. An operating system with support for the NX bit may mark certain areas of an address space as non-executable. The processor will then refuse to execute any code residing in these areas of the address space. The general technique, known as executable space protection, also called Write XOR Execute, is used to prevent certain types of malicious software from taking over computers by inserting their code into another program's data storage area and running their own code from within this section; one class of such attacks is known as the buffer overflow attack. The term NX bit originated with Advanced Micro Devices (AMD), as a marketing term. Intel markets the feature as the XD bit (execute disable). The MIPS architecture refers to the feature as XI bit (execute inhibit). The ARM architecture refers to the feature, which was introduced in ARMv6, as XN (execute never). The term NX bit itself is sometimes used to describe similar technologies in other processors.
NX_bit - Wikipedia, the free encyclopedia
An integrated GPU (Graphics Processing Unit), or integrated graphics, is a technology that allows a motherboard to handle graphics generation without a dedicated expansion card containing a GPU. Generally this requires that some of the system memory (RAM) be assigned to the GPU instead of the CPU, which results in slower performance.
A graphics processing unit (GPU) is a specialized electronic circuit designed to manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device. GPUs are used in embedded systems, mobile phones, personal computers, workstations, and game consoles. Modern GPUs are efficient at manipulating computer graphics and image processing. Their parallel structure makes them more efficient than general-purpose central processing units (CPUs) for algorithms that process large blocks of data in parallel. In a personal computer, a GPU can be present on a video card or embedded on the motherboard. In some CPUs, they are embedded on the CPU die.
Graphics_processing_unit - Wikipedia, the free encyclopedia
If a system fails to auto-configure the newly added adapter card through PnP, finishing the configuration process requires manual installation of device drivers.
True. If a Plug and Play (PnP) device is not automatically configured, the administrator must manually install the driver, which can be found on the product website or in the materials provided with the device.
Thunderbolt 3 uses the Universal Serial Bus (USB) Type-C connector. Older Thunderbolt versions (1 and 2) use the Mini DisplayPort (MDP) connector.
Thunderbolt is the brand name of a hardware interface for the connection of external peripherals to a computer. It has been developed by Intel, in collaboration with Apple. It was initially marketed under the name Light Peak, and first sold as part of an end-user product on 24 February 2011. Thunderbolt combines PCI Express (PCIe) and DisplayPort (DP) into two serial signals, and additionally provides DC power, all in one cable. Up to six peripherals may be supported by one connector through various topologies. Thunderbolt 1 and 2 use the same connector as Mini DisplayPort (MDP), whereas Thunderbolt 3 and 4 reuse the USB-C connector from USB.
Thunderbolt_(interface) - Wikipedia, the free encyclopedia
Which of the terms listed below refers to a situation where a single Central Processing Unit (CPU) appears as two virtual CPUs to the operating system?
Hyper-threading makes multitasking on your system much easier, allowing two or more processes to run using the same resources.
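A quick way to observe hyper-threading is to compare logical and physical CPU counts. A minimal Python sketch (assuming the third-party psutil package is installed for the physical count):

```python
import os
import psutil  # third-party: pip install psutil

# With Hyper-Threading enabled, each physical core is presented to the
# OS as two logical processors, so the first count can be double the second.
print("logical CPUs:  ", os.cpu_count())
print("physical cores:", psutil.cpu_count(logical=False))
```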
Hyper-threading (officially called Hyper-Threading Technology or HT Technology and abbreviated as HTT or HT) is Intel's proprietary simultaneous multithreading (SMT) implementation used to improve parallelization of computations (doing multiple tasks at once) performed on x86 microprocessors. It was introduced on Xeon server processors in February 2002 and on Pentium 4 desktop processors in November 2002. Since then, Intel has included this technology in Itanium, Atom, and Core 'i' Series CPUs, among others. For each processor core that is physically present, the operating system addresses two virtual (logical) cores and shares the workload between them when possible. The main function of hyper-threading is to increase the number of independent instructions in the pipeline; it takes advantage of superscalar architecture, in which multiple instructions operate on separate data in parallel. With HTT, one physical core appears as two processors to the operating system, allowing concurrent scheduling of two processes per core. In addition, two or more processes can use the same resources: If resources for one process are not available, then another process can continue if its resources are available. In addition to requiring simultaneous multithreading support in the operating system, hyper-threading can be properly utilized only with an operating system specifically optimized for it.
Hyper-threading - Wikipedia, the free encyclopedia
After installing a new integrated component such as a Network Interface Card (NIC) on a newly assembled computer system, you should first:
If the computer is newly assembled, it likely needs the device driver installed before the NIC will be functional. It is possible that the device is disabled in the Windows Control Panel or the BIOS, but this is not the default behavior of either.
A 64-bit CPU can be used either with a 32-bit or 64-bit version of the Microsoft Windows operating system.
64-bit CPUs are compatible with 32-bit Microsoft operating systems, although they perform best with a 64-bit version.
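For instance, a short Python check (standard library only) reports whether the running interpreter and machine are 64-bit, independent of what the CPU itself supports:

```python
import platform
import sys

print(platform.machine())          # e.g. 'x86_64' or 'AMD64' hardware
print(platform.architecture()[0])  # '64bit' or '32bit' build of Python
print(sys.maxsize > 2**32)         # True only on a 64-bit interpreter
print(2**64)                       # 18446744073709551616 representable values
```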
In computer architecture, 64-bit integers, memory addresses, or other data units are those that are 64 bits wide. Also, 64-bit CPUs and ALUs are those that are based on processor registers, address buses, or data buses of that size. A computer that uses such a processor is a 64-bit computer. From the software perspective, 64-bit computing means the use of machine code with 64-bit virtual memory addresses. However, not all 64-bit instruction sets support full 64-bit virtual memory addresses; x86-64 and ARMv8, for example, support only 48 bits of virtual address, with the remaining 16 bits of the virtual address required to be all 0's or all 1's, and several 64-bit instruction sets support fewer than 64 bits of physical memory address. The term 64-bit also describes a generation of computers in which 64-bit processors are the norm. 64 bits is a word size that defines certain classes of computer architecture, buses, memory, and CPUs and, by extension, the software that runs on them. 64-bit CPUs have been used in supercomputers since the 1970s (Cray-1, 1975) and in reduced instruction set computers (RISC) based workstations and servers since the early 1990s. In 2003, 64-bit CPUs were introduced to the mainstream PC market in the form of x86-64 processors and the PowerPC G5. A 64-bit register can hold any of 2^64 (over 18 quintillion, or 1.8×10^19) different values. The range of integer values that can be stored in 64 bits depends on the integer representation used. With the two most common representations, the range is 0 through 18,446,744,073,709,551,615 (2^64 − 1) for an unsigned binary number, and −9,223,372,036,854,775,808 (−2^63) through 9,223,372,036,854,775,807 (2^63 − 1) for two's complement.
64-bit_computing - Wikipedia, the free encyclopedia
The Real-Time Clock (RTC) is a chip embedded in most electronic devices. The RTC keeps the time for a device and must have an alternative power source to use when the device is powered off; for computers this is the CMOS battery.
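On Linux the kernel exposes the RTC through sysfs, so the battery-backed clock can be read directly; a minimal sketch (paths assumed: rtc0 is typically the CMOS-backed clock on a PC):

```python
# Read the hardware clock, which keeps running on the CMOS battery
# even while the machine is powered off.
with open("/sys/class/rtc/rtc0/date") as f:
    print("RTC date:", f.read().strip())  # e.g. 2016-05-01
with open("/sys/class/rtc/rtc0/time") as f:
    print("RTC time:", f.read().strip())  # e.g. 13:37:05
```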
A real-time clock (RTC) is an electronic device (most often in the form of an integrated circuit) that measures the passage of time. Although the term often refers to the devices in personal computers, servers and embedded systems, RTCs are present in almost any electronic device which needs to keep accurate time of day.
Real-time_clock - Wikipedia, the free encyclopedia
The micro form factor (MicroDIMM) is a smaller memory module typically used in laptops. It has 214 pins.
A DIMM (Dual In-line Memory Module), commonly called a RAM stick, comprises a series of dynamic random-access memory integrated circuits. These memory modules are mounted on a printed circuit board and designed for use in personal computers, workstations, printers, and servers. They are the predominant method for adding memory into a computer system. The vast majority of DIMMs are standardized through JEDEC standards, although there are proprietary DIMMs. DIMMs come in a variety of speeds and sizes, but generally are one of two lengths: PC, which are 133.35 mm (5.25 in), and laptop (SO-DIMM), which are about half the size at 67.60 mm (2.66 in).
DIMM - Wikipedia, the free encyclopedia
A single IEEE 1394 (Institute of Electrical and Electronics Engineers) host controller can support up to how many devices at a given time?
Using a daisy chain, a single IEEE 1394 (FireWire) host controller can support up to 63 devices at one time. (The bus uses a 6-bit physical ID, allowing 2^6 = 64 addresses; one is reserved for broadcast, leaving 63 for devices.)
IEEE_1394 - Wikipedia, the free encyclopedia
Which integrated circuit type best defines the core functionality and capabilities of a motherboard?
The chipset manages the flow of data between the core components of the motherboard, such as the processor, memory, and other peripherals.
In a computer system, a chipset is a set of electronic components on one or more ULSI integrated circuits known as a "Data Flow Management System" that manages the data flow between the processor, memory and peripherals. It is usually found on the motherboard of computers. Chipsets are usually designed to work with a specific family of microprocessors. Because it controls communications between the processor and external devices, the chipset plays a crucial role in determining system performance.
Chipset - Wikipedia, the free encyclopedia
The Micro Serial Advanced Technology Attachment (SATA) standard defines a data cable connector consisting of:
Be sure to carefully read the question! It specifically asks about the SATA data cable connector, which has 7 pins. A SATA power cable uses 15 pins, so that answer is close but still incorrect.
Serial ATA (SATA, abbreviated from Serial AT Attachment) is a computer bus interface that connects host bus adapters to mass storage devices such as hard disk drives, optical drives, and solid-state drives. Serial ATA succeeded the earlier Parallel ATA (PATA) standard to become the predominant interface for storage devices. Serial ATA industry compatibility specifications originate from the Serial ATA International Organization (SATA-IO) which are then promulgated by the INCITS Technical Committee T13, AT Attachment (INCITS T13).
Serial_ATA - Wikipedia, the free encyclopediaLooks like thats it! You can go back and review your answers or click the button below to grade your test.