Forum Posts

yokeser683
Feb 04, 2022
In General Discussions
Natural leather is an expensive material usually used for shoes, belts, jackets, purses, and other articles of clothing and accessories. Leather can become discolored over time, in which case you can lighten it to save the money you would spend on a replacement. You can also lighten a leather item to match other leather items you own. Lightening leather does not give guaranteed results, so test in an inconspicuous area before treating the entire piece. Leather often has a clear protective top finish, which must be removed before the leather can be lightened.

Removing the top finish

Step 1. Dip a clean cloth in leather deglazer or acetone and wring out the excess product. Wear rubber gloves to protect your hands from irritation.

Step 2. Rub the deglazer or acetone directly onto the leather.

Step 3. Wipe off the deglazer with a clean cloth to remove the top finish. Allow to dry.

Step 4. Turn to a clean part of the cloth, or use a new cloth, as the old one gets dirty.

Bleaching the color

Step 1. Mix oxalic acid, a type of bleach for leather and wood, with water to create a solution for lightening the leather's color. Oxalic acid is commonly sold as a cleaner; follow the mixing directions for the cleaner, but add more oxalic acid to achieve a stronger lightening effect.

Step 2. Rub the solution onto the leather with a clean cloth. Allow it to dry to observe the results, then repeat the process, adding more oxalic acid to the water if stronger results are desired.

Step 3. Soak the leather in the solution in 10-minute intervals, allowing it to dry between soaks to check progress, until the desired results are achieved. Try this step at your own risk, and only if rubbing the solution on does not produce sufficient results.

Step 4. Dye the leather with leather dye if you like. There are rub-on and dip-style dyes made explicitly for leather.
How do I lighten my leather?
yokeser683
Feb 04, 2022
In General Discussions
A popular-science article about processor microcode and the direction the "iron" (hardware) industry might take.

"The best way to predict the future is to create it." (Bob Noyce, founder of Intel)

In this article, we will talk about where the microprocessor industry may be heading. The industry is evolving in several directions:

- updating the element base (nanotubes and other silicon substitutes);
- introducing entirely new architectures that differ from von Neumann's (for example, neural networks);
- updating the existing architecture (more cores, 64-bit ports).

Research work follows these three main directions. Everyone without exception, from Intel and IBM down to the makers of cascade amplifier chips, is updating the existing architecture: adding processor cores, optimizing bus signaling, and so on. But one important avenue of optimization, invented back in the 1970s, is being left out. We are talking about microcode. One possible path for the processor industry is microprocessors whose microcode can be changed on the fly. The user himself could change his processor's instructions for specific tasks, or programs could ask the processor at runtime for the instructions that best fit their workload. For example: the user launches 3D Studio MAX, and the processor is loaded with instructions optimized for rendering; the user starts video conversion for an iPhone, and the processor "turns into" a hardware-optimized codec.

Think that's unrealistic? Then what follows will be a revelation to many. Nobody panic; take a deep breath, and off we go.

If someone thinks that assembly-language programming is "direct" communication with the computer, he is mistaken. Assembler is a high-level language (albeit the lowest-level of all the high-level languages). Yes, even the assembly language in which drivers and shellcodes are written is far from a tete-a-tete between a programmer and a piece of silicon. Each instruction in "low-level" assembler consists of several micro-commands. These micro-commands are the microcode, which you can read about here and here. You could also study the .NET Framework and ActionScript 3 some more, and then you will surely understand microprocessor architecture. But seriously: each assembly instruction is a mnemonic for a group of micro-commands. An example of an assembly instruction:

    ADD EAX, 5 ; add 5 to the EAX register and put the result back into EAX

But here is what the processor actually has to do for this instruction:

1. Read the value of the EAX register.
2. Read the value of the number 5.
3. Add them.
4. Write the result back to EAX.

From this simple example, you can see that ADD is just a link into the microcode table, pointing at a sequence of micro-commands. And that sequence can be changed and reflashed inside the central processor!

A word and a half about the architecture of modern processors. If you take a Pentium-family stone (chip) and look at it under an X-ray, you will see that the external instruction set is implemented according to the CISC philosophy, but every instruction is broken down into elementary micro-commands executed by a RISC core. The decoded micro-commands correspond directly to the elementary operations the hardware can execute; they are much simpler than machine-language instructions. So CISC and RISC are two different schools that Intel has combined on one chip. The CISC school says, "Go buy bread." The RISC school says, "Get dressed, put on your shoes, walk to the store, buy bread, come back." Since getting dressed and putting on shoes are useful for more than just going to the store, these commands are called elementary; they execute faster and are reused in many different higher-level assembler commands.
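To make the "link into the microcode table" idea concrete, here is a minimal software model of such a machine. It is only a sketch of the concept, not anything a real CPU exposes to programs: the register file, the micro-ops, and the ucode_add table are all invented for the illustration. The mnemonic ADD is one row of micro-op pointers, and patching a single entry changes what ADD means on the fly:

    /* Toy software model of a microcoded CPU. Illustration only: real x86
       microcode is not exposed to programs like this, and every name here
       (latch, mread, ucode_add, ...) is invented for the sketch. */
    #include <stdio.h>

    enum { R_EAX, R_EBX, NREGS };   /* register file */
    static int regs[NREGS];
    static int latch;               /* internal scratch latch */

    /* Elementary micro-ops: the only things the "hardware" can really do. */
    typedef void (*micro_op)(int reg, int imm);
    static void mread (int reg, int imm) { latch = regs[reg]; } /* latch <- reg */
    static void madd  (int reg, int imm) { latch += imm;      } /* latch += imm */
    static void msub  (int reg, int imm) { latch -= imm;      } /* latch -= imm */
    static void mwrite(int reg, int imm) { regs[reg] = latch; } /* reg <- latch */

    /* One row of the microcode table: ADD = read, add, write back. */
    static micro_op ucode_add[] = { mread, madd, mwrite, NULL };

    /* The decoder: the mnemonic ADD is just a link to that row. */
    static void exec_add(int reg, int imm) {
        for (int i = 0; ucode_add[i] != NULL; i++)
            ucode_add[i](reg, imm);
    }

    int main(void) {
        regs[R_EAX] = 10;
        exec_add(R_EAX, 5);                  /* ADD EAX,5 */
        printf("EAX = %d\n", regs[R_EAX]);   /* prints EAX = 15 */

        ucode_add[1] = msub;                 /* "reflash" the microcode on the fly */
        exec_add(R_EAX, 5);                  /* same ADD mnemonic, new meaning */
        printf("EAX = %d\n", regs[R_EAX]);   /* prints EAX = 10 */
        return 0;
    }

The point of the sketch is the second half of main(): the decoder never learns how to add or subtract, it only walks whatever row of the table the mnemonic points at, and that is exactly the property that would let instructions be swapped out for specific tasks.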
Why did they introduce microcode at all?

By its nature, the central processing unit has to perform a unifying function. It has to manage the various hardware components of a computer, such as disk drives, graphical displays, and network interfaces, and make sure they all work together coherently. This means desktop central processors have a complex architecture, because they must support basic functions such as memory protection, integer arithmetic, floating-point operations, and vector processing. As a result, a typical modern central processor supports several hundred instructions that provide all these functions. Implementing such a complex instruction dictionary therefore requires an instruction-decoding module and a great many integrated circuits that actually perform the actions the instructions define. In other words, a typical desktop processor contains tens of millions of transistors.

Microcode also simplifies the debugging of microprocessors. For example, once an elementary micro-command that is used in many higher-level commands has been debugged, we can be sure that this microcode will also work correctly in new commands built from it.

What could we gain from programming our own processor commands? For starters, remember the holy wars of the 1990s about whether personal computers needed 3D accelerators. Nowadays, graphics chips are integrated even into motherboards, and some video cards carry more cache and RAM than the CPU and motherboard. What does the video card's processor have that the CPU doesn't? That's right: dedicated, hardware-optimized graphics-processing instructions. Video chips are also optimized for floating-point work. That is why they process graphics dozens of times faster than the CPU, and, most importantly, the latter is now free to do its own (non-graphics) job. Moreover, the tendency to run even general mathematics on video processors has lately been growing: certain optimizations of the graphics chip's high-level instructions made it more efficient for calculations over large data sets, graphics processing being just a special case of such calculations. You can read about the NVIDIA CUDA project for more details (a minimal sketch of this kind of offload appears at the end of this post).

The bottom line

Today we took a closer look at the possible prospects of updating microprocessor architecture in light of reprogramming it and adding processor instructions of our own. I am sure the potential of microcode is still as dormant as, say, virtual-space technology. Imagine yourself in the shoes of the CEO of a company developing a new device. To build it, the company has to develop a microchip: design the circuit and order a test batch from the factory. The factory makes the pieces of silicon, which then have to be soldered into the overall circuit, tested, corrected, reordered with the corrections, and so on. There is another option: buy a reprogrammable microprocessor and quietly debug all the instructions on it as many times as you like. I think this approach has a future.

This article is backed by Robert Brown (NerdyTechy), Ivan Kuleshov, Carl Bugeja, and Robin Reiter.
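Addendum: the GPU-offload sketch promised above. This is a minimal CUDA program, not anything taken from the article itself: it uses the standard CUDA runtime API, and the array size, the constant 3.0f, and the launch geometry are arbitrary illustration choices. It moves the classic data-parallel loop y = a*x + y (SAXPY) onto the video processor:

    // Minimal CUDA sketch: offloading the data-parallel loop y = a*x + y
    // (the classic SAXPY) to the GPU. Standard CUDA runtime API; the size n,
    // the constant 3.0f, and the 256-thread blocks are arbitrary choices.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                 // ~1M elements
        const size_t bytes = n * sizeof(float);
        float *hx = (float *)malloc(bytes);
        float *hy = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;                        // device-side copies
        cudaMalloc(&dx, bytes);
        cudaMalloc(&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        // Thousands of GPU threads each run the same elementary operation,
        // instead of one CPU core grinding through a million loop iterations.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
        printf("hy[0] = %f\n", hy[0]);         // expect 5.0 (3*1 + 2)
        cudaFree(dx); cudaFree(dy);
        free(hx); free(hy);
        return 0;
    }

Each element gets its own GPU thread executing the same elementary operation, which is exactly why such chips chew through large data sets so much faster than a general-purpose CPU looping over one element at a time.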