Thursday 26 February 2015

What Is ISL Uranium Mining?

In situ leach mining (ISL), also known as in-situ mining or solution mining, was first used as a means to extract low-grade uranium from ore in underground mines. First used in Wyoming in the 1950s, originally as a low-production experiment at the Lucky June mine, it became a high-production, low-cost method of fulfilling Atomic Energy Commission uranium requirements at Utah Construction Company's Shirley Basin mining operations in the 1960s. The method was pioneered through the efforts of Charles Don Snow, a uranium mining and exploration geologist employed by Utah, and many of his developments are still used in ISL mining today.

What is ISL mining? The Wyoming Mining Association website explains it in the following manner. (We chose Wyoming because it is the birthplace of "solution mining," as the method was originally called.)

"In-situ mining is a noninvasive, environmentally friendly mining process involving minimal surface disturbance which extracts uranium from porous sandstone aquifers by reversing the natural processes which deposited the uranium.

To be mined in situ, the uranium deposit must occur in permeable sandstone aquifers. These sandstone aquifers provide the "plumbing system" for both the original emplacement and the recovery of the uranium. The uranium was emplaced by weakly oxidizing ground water which moved through the plumbing systems of the geologic formation. To effectively extract uranium deposited from ground water, a company must first thoroughly define this plumbing system and then design well fields that best fit the natural hydro-geological conditions.

Detailed mapping techniques, using geophysical data from standard logging tools, have been developed by uranium companies. These innovative mapping methods define the geologic controls of the original solutions, so that these same routes can be retraced for effective in situ leaching of the ore. Once the geometry of the ore bodies is known, the locations of injection and recovery wells are planned to effectively contact the uranium. This technique has been used in several thousand wells covering hundreds of acres.

Following the installation of the well field, a leaching solution (or lixiviant), consisting of native ground water containing dissolved oxygen and carbon dioxide, is delivered to the uranium-bearing strata through the injection wells. Once in contact with the mineralization, the lixiviant oxidizes the uranium minerals, which allows the uranium to dissolve in the ground water. Production wells, located between the injection wells, intercept the pregnant lixiviant and pump it to the surface. A centralized ion-exchange facility extracts the uranium; the barren lixiviant, stripped of uranium, is regenerated with oxygen and carbon dioxide and recirculated for continued leaching. The ion exchange resin, which becomes 'loaded' with uranium, is stripped, or eluted. Once eluted, the ion exchange resin is returned to the well field facility. [The underlying chemistry is sketched after this excerpt.]

During the mining process, slightly more water is produced from the ore-bearing formation than is reinjected. This net withdrawal, or 'bleed,' produces a cone of depression in the mining area, controlling fluid flow and confining it to the mining zone. The mined aquifer is surrounded, both laterally and above and below, by monitor wells which are frequently sampled to ensure that all mining fluids are retained within the mining zone. The 'bleed' also provides a chemical bleed on the aquifer to limit the buildup of species like sulfate and chloride which are affected by the leaching process. The 'bleed' water is treated for removal of uranium and radium. This treated water is then disposed of through waste water land application, or irrigation. A very small volume of radioactive sludge results; this sludge is disposed of at an NRC licensed uranium tailings facility.

The ion exchange resin is stripped of its uranium, and the resulting rich eluate is precipitated to produce a yellow cake slurry. This slurry is dewatered and dried to a final drummed uranium concentrate.

At the conclusion of the leaching process in a well field area, the same injection and production wells and surface facilities are used for restoration of the affected ground water. Ground water restoration is accomplished in three ways. First, the water in the leach zone is removed by "ground water sweep", and native ground water flows in to replace the removed contaminated water. The water which is removed is again treated to remove radionuclides and disposed of in irrigation. Second, the water which is removed is processed to purify it, typically with reverse osmosis, and the pure water is injected into the affected aquifer. This reinjection of very pure water results in a large increment of water quality improvement in a short time period. Third, the soluble metal ions which resulted from the oxidation of the ore zone are chemically immobilized by injecting a reducing chemical into the ore zone, immobilizing these constituents in situ. Ground water restoration is continued until the affected water is suitable for its pre-mining use.

Throughout the leaching and restoration processes, a company ensures the isolation of the leach zone by careful well placement and construction. The well fields are extensively monitored to prevent the contamination of other aquifers.

Once mining is complete, the aquifer is restored by pumping fresh water through the aquifer until the ground water again meets its pre-mining use.

In situ mining has several advantages over conventional mining. First, the environmental impact is minimal, as the affected water is restored at the conclusion of mining. Second, it is lower cost, allowing Wyoming's low grade deposits to compete globally with the very high grade deposits of Canada. Finally, the method is safe and proven, resulting in minimal employee exposure to health risks."
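For readers who want the underlying chemistry, the oxygen/carbonate leach cycle described in the excerpt is conventionally summarized by the following textbook reactions; this is our sketch, not part of the Wyoming Mining Association text, and R denotes an ion exchange site on the resin:

    2UO2 + O2 -> 2UO3                                (oxidation by dissolved oxygen)
    UO3 + 2HCO3- -> UO2(CO3)2^2- + H2O               (carbonate complexation)
    UO2(CO3)2^2- + CO3^2- -> UO2(CO3)3^4-            (further complexation)
    4RCl + UO2(CO3)3^4- -> R4[UO2(CO3)3] + 4Cl-      (loading onto chloride-form resin)

Elution reverses the last reaction, stripping the uranium carbonate complex off the resin so the resin can be regenerated and reused.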

ISL mining may be the wave of the future of U.S. uranium mining, or it may become an interim mining measure in areas where the geology is appropriate for ISL. Until sufficient quantities of uranium are required by U.S. utilities to fuel the country's demand for nuclear energy, ISL mining may remain the leading uranium mining method in the United States. At some point, an overwhelming need for uranium for the nuclear fuel cycle may again put ISL mining in the backseat, and uranium miners may return to conventional mining methods, such as open pit mining.

Source: http://ezinearticles.com/?What-Is-ISL-Uranium-Mining&id=183880

Sunday 22 February 2015

New Technique to Boost US Uranium Mining - Satellite Plants

If you study the news releases, several companies have discussed setting up one or more satellite plants in conjunction with their In Situ Recovery (ISR) uranium mining operations. To help readers better understand what exactly a 'satellite plant' is, we interviewed Mark Pelizza of Uranium Resources about how this relatively new operational technique is presently being used at the company's Texas operations. This is part two of our six-part series describing the evolution of ISR uranium mining, building upon last year's basic series on this subject.

A larger uranium deposit, such as one at Cameco's Smith Ranch in Wyoming, requires a Central Processing Plant. The 'mother plant,' as it is called in the trade, can complete the entire processing cycle from uranium extraction through loading the resin, stripping the uranium from the resin with a solvent (elution), precipitating, drying and packaging.

With a satellite plant, also known as Remote Ion Exchange (RIX), smaller and more distant deposits can also be mined, with the loaded resin then trucked to the mother plant. With an RIX operation, the front end of the 'milling' cycle can begin independently of the much larger mother plant, using the same ion exchange columns found at a central processing facility. The mobility factor makes RIX an attractive proposition for many of the new-breed uranium producers. Rather than piping the water and uranium across a longer distance to the mother plant for the entire processing cycle, the modular nature of RIX allows for multiple columns at each well field doing the ion exchange on the spot.

This is not a new idea, but one which has instead been re-designed by Uranium Resources and is also used elsewhere. In the early 1970s, Conoco and Pioneer Nuclear Corporation formed the Conquista project in south Texas. Uranium was open-pit mined at between ten and fifteen mines within a thirty-five mile radius and in two counties. Trucks hauled ore to the 1750-ton/day processing mill near Falls City in Karnes County.

"The trademark of south Texas is a lot of small million-pound-style deposits," Mark Pelizza told us. "I think we are heading in the right direction to exploit those small deposits." Trucking resin beads loaded with uranium is different from trucking ore which has been conventionally mined. Small, scattered uranium deposits aren't only found in Texas. There are numerous smaller ISR-amenable properties in Wyoming, New Mexico, Colorado and South Dakota.

"About half the uranium deposits in New Mexico can be mined with ISR," Pelizza said, "and the other half would require conventional mining." A number of companies we've interviewed have geographically diverse, but relatively nearby properties within their portfolio. Several companies with whom we discussed RIX have already made plans to incorporate this method into their mining operations.

The sole-use semi-trailer trucks hauling the uranium-loaded resin are different from the typical dump trucks used in conventional mining. According to Pelizza, the truck carries a modified bulk cement trailer with three compartments. The three compartments, or cells, each have a function. One cell holds the uranium-loaded resin, one cell is empty and the third has unloaded resin.

As per Department of Transportation (DOT) regulations, no liquids are permitted during the transportation process. Each container run between the wellfield and the mother plant can bring between 2,000 and 3,000 pounds of uranium-in-resin, depending upon the size of the container. The 'loaded' cell holds between 300 and 500 pounds of resin with six to eight pounds of uranium per cubic foot of resin. The age of the resin is important, too. New resin can hold up to ten pounds of uranium per cubic foot, declining to five pounds of uranium per cubic foot after several years.
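As a rough consistency check (our arithmetic, not figures from the interview), the quoted per-run quantity implies the loaded cell is better read in cubic feet of resin than in pounds:

    2,000 lb U at 8 lb U per cubic foot = 250 cubic feet of resin
    3,000 lb U at 6 lb U per cubic foot = 500 cubic feet of resin

which roughly lines up with the '300 and 500' figure above if it refers to cubic feet.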

As with a conventional ion exchange process, the RIX system is run as a closed-loop pressurized process to prevent the release of radon gas into the atmosphere. The uranium is oxidized, mobilized and pumped out of the sandstone formation through a 'loaded' pipeline, ending up in an ion exchange column at the mining site. Inside the columns, uranium is extracted through an ion exchange process: a chloride ion on a resin bead exchanges for a uranium ion. After the fluid has been stripped of uranium, it is sent back to the wellfield as barren solution, minus the bleed.

When the ion exchange column is fully loaded, the column is taken offline. The loaded resin is transferred from the column to a bulk cement trailer, which is a pressurized vessel made of carbon steel with a rubberized internal lining. The resin trailer is connected to the ion exchange column transfer piping with hoses. After it has been drained of any free water, the uranium-loaded resin can be transported as a solid, known as 'wet yellowcake,' to the mother plant. There, the uranium is stripped from the resin, precipitated as a yellowcake slurry, and vacuum-dried with a commercial-grade food dryer.

Capital costs can be dramatically reduced with the satellite plants, or RIX units. "Well field installation can cost more than RIX," Pelizza noted. Often, installing a well field can start at approximately $10 million and run multiples higher, depending upon the spacing of the wells and the depth at which uranium is mined. Still, compared to conventional mining, the entire ISR well field mining and solvent circuit method of uranium processing is relatively inexpensive.

We checked with a number of near-term producers - those with uranium projects in Wyoming - and discovered at least three companies planned to utilize one or more satellite plants, or RIX, in their operations. A company's reason for utilizing this method is to minimize capital and operating expenses while mining multiple smaller deposits within the same area. Water is treated at the RIX to extract the uranium instead of piping it across greater distances to a full-sized plant. Pelizza said, "The potential for pipeline failure and spillage from a high-flow trunk line is eliminated."

Strathmore Minerals vice president of technical services John DeJoia said his company was moving forward with a new type of Remote Ion Exchange design, but would not provide details. UR-Energy chief executive Bill Boberg said his company would use an RIX for either Lost Soldier or Lost Creek in Wyoming, perhaps for both. Uranerz Energy chief executive Glenn Catchpole told us he planned to probably set up two RIX operations at the company's Wyoming properties and build a central processing facility.

"We are working on a standardized design of the remote ion exchange unit so it doesn't require any major licensing action," Pelizza said. "If you can speed up the licensing time, perhaps it would take one to two years rather than three to five years."

Source: http://ezinearticles.com/?New-Technique-to-Boost-US-Uranium-Mining---Satellite-Plants&id=495199

Thursday 19 February 2015

Data Mining vs Screen-Scraping

Data mining isn't screen-scraping. I know that some people may disagree with that statement, but they're actually two almost completely different concepts.

In a nutshell, you might state it this way: screen-scraping allows you to get information, whereas data mining allows you to analyze information. That's a pretty big simplification, so I'll elaborate a bit.

The term "screen-scraping" comes from the old mainframe terminal days where people worked on computers with green and black screens containing only text. Screen-scraping was used to extract characters from the screens so that they could be analyzed. Fast-forwarding to the web world of today, screen-scraping now most commonly refers to extracting information from web sites. That is, computer programs can "crawl" or "spider" through web sites, pulling out data. People often do this to build things like comparison shopping engines, archive web pages, or simply download text to a spreadsheet so that it can be filtered and analyzed.

Data mining, on the other hand, is defined by Wikipedia as the "practice of automatically searching large stores of data for patterns." In other words, you already have the data, and you're now analyzing it to learn useful things about it. Data mining often involves lots of complex algorithms based on statistical methods. It has nothing to do with how you got the data in the first place. In data mining you only care about analyzing what's already there.
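By contrast, data mining starts from data you already hold. Here is a toy illustration (our example; the article names no specific algorithm) that searches purchase baskets for item pairs that occur together often, a miniature version of association analysis:

    # Count how often item pairs co-occur across baskets and report
    # the pairs whose support (fraction of baskets) crosses a threshold.
    from collections import Counter
    from itertools import combinations

    baskets = [                     # hypothetical, already-collected data
        {"milk", "bread", "butter"},
        {"milk", "bread"},
        {"bread", "butter"},
        {"milk", "butter"},
        {"milk", "bread", "eggs"},
    ]

    pair_counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            pair_counts[pair] += 1

    min_support = 0.4
    for pair, n in pair_counts.most_common():
        support = n / len(baskets)
        if support >= min_support:
            print(f"{pair}: support={support:.0%}")

Note that the code never fetches anything; the interesting work is in analyzing data that was already gathered, which is the essential difference from screen-scraping.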

The difficulty is that people who don't know the term "screen-scraping" will try Googling for anything that resembles it. We include a number of these terms on our web site to help such folks; for example, we created pages entitled Text Data Mining, Automated Data Collection, Web Site Data Extraction, and even Web Site Ripper (I suppose "scraping" is sort of like "ripping"). So it presents a bit of a problem: we don't necessarily want to perpetuate a misconception (i.e., screen-scraping = data mining), but we also have to use terminology that people will actually use.

Source: http://ezinearticles.com/?Data-Mining-vs-Screen-Scraping&id=146813

Wednesday 18 February 2015

There is No Need to Disrupt the Schedule to Keep the Kitchen Canopy and Extraction System Clean

After taking over a large and beautiful stately hotel its new owner quickly realised that the kitchen extract system would not be straightforward to maintain because the duct work for the extract system was somewhat ancient and therefore would be difficult to clean.

A prestige hotel needs to maintain a high level of hygiene as well as to minimise the risk of a kitchen fire.

So, if replacing the entire system is not an option, what can the new owner do to find a solution that meets exacting standards of cleanliness, ensures that the risk of a fire starting in the system is minimised, and ensures that the cleaning does not disrupt the operation of the hotel and restaurant as a business?

Using an experienced specialist commercial cleaning service to assess the establishment, the types of food cooked, and how and at what level of intensity the kitchen is used is the first step.

Without this information it is difficult to advise on how maintenance should be carried out.

The frequency of the cleaning cycle for a canopy and its components depends not only on the regularity and duration of cooking below but also on the type of cooking and the ingredients being used.

Where kitchen use is light, canopies and extract systems may only need a 12-month cycle for maintenance and cleaning. However, in a busy hotel, kitchen activity is most likely to be heavy and the cleaning company may advise a three- or four-month cycle.

Grease filters and canopies over the cookers should ideally be designed, sized and constructed to be robust enough for regular washing in a commercial dishwasher, which is the most thorough and efficient method of cleaning them yourself.

It's important to make sure when re-installing filters that they are fitted the right way around, with any framework drain holes at the lowest, front edge. Of course, grease filters are covered with a coating of grease and can therefore be slippery and difficult to handle, so appropriate protective gloves should be worn when handling them.

The canopies and their component parts should be designed to be easy to clean, but if they are not, provided the cleaning intervals are fairly frequent, regular washing with soap or mild detergent and warm water, followed by a clean water rinse might be adequate. If too long a period is left between cleans, grease will become baked-on and require special attention.

No grease filtration is 100% efficient and therefore a certain amount of grease passes through the filters to be deposited on the internal surfaces of the filter housings and ductwork.

Left unattended, this layer of grease on the non-visible surfaces of the canopy creates both hygiene and fire risks.

Deciding on when cleaning should take place, and how often, is something an experienced specialist cleaning company can help with. The simplest guide is that if a surface or component looks dirty, then it needs cleaning.

Most important, however, is regular inspection of all surfaces and especially non-visible ones. The maintenance schedule for any kitchen installation should include inspections.

Copyright (c) 2010 Alison Withers

A regular maintenance and cleaning schedule is possible even in the kitchen of a hotel with an antiquated canopy and duct system, with the help of a specialist commercial cleaning company to advise on how to do it without disrupting the work flow, as writer Ali Withers discovers.

Source: http://ezinearticles.com/?There-is-No-Need-to-Disrupt-the-Schedule-to-Keep-the-Kitchen-Canopy-and-Extraction-System-Clean&id=4877266

Sunday 15 February 2015

The Trouble With Bots, Spiders and Scrapers

With the Q4 State of the Internet - Security Report due out later this month, we continue to preview sections of it.

Earlier this week we told you about a DDoS attack from a group claiming to be Lizard Squad. Today we look at how third-party content bots and scrapers are becoming more prevalent as developers seek to gather, store, sort and present a wealth of information available from other websites.

These meta searches typically use APIs to access data, but many now use screen-scraping to collect information.

As the use of bots and scrapers continues to surge, there's an increased burden on webservers. While bot behavior is mainly harmless, poorly-coded bots can hurt site performance and resemble DDoS attacks. Or, they may be part of a rival's competitive intelligence program.

Understanding the different categories of third-party content bots, how they affect a website, and how to mitigate their impact is an important part of building a secure web presence.

Specifically, Akamai has seen bots and scrapers used for such purposes as:

•    Setting up fraudulent sites
•    Reuse of consumer price indices
•    Analysis of corporate financial statements
•    Metasearch engines
•    Search engines
•    Data mashups
•    Analysis of stock portfolios
•    Competitive intelligence
•    Location tracking

During 2014 Akamai observed a substantial increase in the number of bots and scrapers hitting the travel, hotel and hospitality sectors. The growth in scrapers targeting these sectors is likely driven by the rise of rapidly developed mobile apps that use scrapers as the fastest and easiest way to collect information from disparate websites.

Scrapers target room-rate pages for hotels and pricing and schedules for airlines. In many cases that Akamai investigated, scrapers and bots made several thousand requests per second, far in excess of what can be expected from a human using a web browser.

An interesting development in the use of headless browsers is the advent of companies that offer scraping as a service, such as PhantomJs Cloud. These sites make it easy for users to scrape content and have it delivered, lowering the bar to entry and making it easier for unskilled individuals to scrape content while hiding behind a service.

For each type of bot, there is a corresponding mitigation strategy.

The key to mitigating aggressive, undesirable bots is to reduce their efficiency. In most cases, highly aggressive bots are only helpful to their controllers if they can scrape a lot of content very quickly. Reducing a bot's efficiency through rate controls, tar pits or spider traps can drive bot-herders elsewhere for the data they need.

Aggressive but desirable bots are a slightly different problem. These bots adversely impact operations, but they bring a benefit to the organization. Therefore, it is impractical to block them fully. Rate controls with a high threshold, or a user-prioritization application (UPA) product, are a good way to minimize the impact of a bot. This permits the bot access to the site until the number of requests reaches a set threshold, at which point the bot is blocked or sent to a waiting room. In the meantime, legitimate users are able to access the site normally.
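A minimal sketch of the rate-control idea follows (our illustration in Python; this is not Akamai's implementation, and the handler names are hypothetical). Each client gets a token bucket; a client that exceeds the threshold is blocked if it is undesirable, or parked in a waiting room if it is aggressive but desirable:

    # Token-bucket rate control: serve requests while tokens remain,
    # then block undesirable bots or divert desirable ones.
    import time

    class TokenBucket:
        def __init__(self, rate, burst):
            self.rate = rate                  # tokens replenished per second
            self.burst = burst                # maximum bucket size
            self.tokens = burst
            self.last = time.monotonic()

        def allow(self):
            now = time.monotonic()
            self.tokens = min(self.burst,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False

    buckets = {}                              # client id (e.g. IP) -> bucket

    def handle_request(client_id, desirable_bot=False):
        bucket = buckets.setdefault(client_id, TokenBucket(rate=5, burst=20))
        if bucket.allow():
            return "serve"
        # Over the threshold: park good bots, drop the rest.
        return "waiting_room" if desirable_bot else "block"

    # A scraper firing requests in a tight loop soon exhausts its burst.
    for i in range(30):
        print(i, handle_request("203.0.113.7"))

Production systems layer on more signals (user-agent validation, behavioral fingerprinting), but throttling per-client request rates is the core mechanism behind both the blocking and waiting-room behaviors described above.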

Source: https://blogs.akamai.com/2015/01/performance-mitigation-bots-spiders-and-scrapers.html

Monday 9 February 2015

Application of Web Data Mining in CRM

The process of improving customer relations and interactions and making them more amicable may be termed customer relationship management (CRM). Since web data mining applies various modeling and data analysis methods to detect patterns and relationships in data, it can be used as an effective tool in CRM. By using web data mining effectively, you are able to understand what your customers want.

It is important to note that web data mining can be used effectively to search for the right potential customers to be offered the right products at the right time. The result in any business is an increase in the revenue generated, because you are able to respond to each customer in an effective and efficient way. The method also requires relatively few resources and can therefore be termed economical.
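As one illustration of picking 'the right customers at the right time' (our sketch in Python; the article names no specific technique), a simple recency-frequency-monetary (RFM) score can rank mined customer records for an offer:

    # Rank customers for an offer by a crude RFM score computed from
    # fields you would already hold in a CRM database.
    from dataclasses import dataclass

    @dataclass
    class Customer:
        name: str
        days_since_last_order: int
        orders_per_year: int
        yearly_spend: float

    customers = [                  # hypothetical mined CRM records
        Customer("Ada", 12, 24, 1800.0),
        Customer("Bo", 200, 2, 90.0),
        Customer("Cy", 45, 10, 600.0),
    ]

    def rfm_score(c: Customer) -> float:
        recency = 1.0 / (1 + c.days_since_last_order)   # fresher is better
        frequency = c.orders_per_year / 12.0            # rough normalization
        monetary = c.yearly_spend / 1000.0
        return recency + frequency + monetary

    # Contact the highest-scoring customers first.
    for c in sorted(customers, key=rfm_score, reverse=True):
        print(f"{c.name}: {rfm_score(c):.2f}")

Real CRM mining would replace this hand-rolled score with learned models, but the flow is the same: patterns in data you already hold decide who gets which offer.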

In the next paragraphs we discuss the basic process of customer relationship management and its integration with a web data mining service. The following are the basic steps that should be used in understanding what your customers need, sending them the right offers and products, and reducing the resources used in managing your customers.

Defining the business objective. Web data mining can be used to define your business objective and communicate it to your customers. By doing research you can determine whether your business objective is communicated well to your customers and clients. Does your business objective take an interest in the customers? Your business goal must be clearly outlined in your business CRM. A precise, well-defined goal is the best way to ensure success in customer relationship management.

Source: http://www.loginworks.com/blogs/web-scraping-blogs/application-web-data-mining-crm/