October 24, 2011

Operations Improvement in Small Manufacturing - Five Overlooked Areas to Mine During Tough Times

Even in tough times when they need to maximize the impact of every nickel, I have seen
many manufacturers burning money.

How?


Well, their attention is normally drawn to the obvious, and managers in tough times
usually tackle those items first: cutting personnel, eliminating extra programs, etc. However, they
may lose money in the long run if they don't attend to other fundamental changes.
Here are a few of the less obvious areas I'm familiar with that can
make a serious impact on the bottom line and deserve attention, especially in this
economy:

- measuring the wrong quality characteristics,

- improper sequencing and execution of orders and processes,

- over-investing in the current "as is" environment prior to making changes,

- living with business software that is sub-optimally implemented or outdated,

- overlooking opportunities to provide closer linkage between R&D and production.

I write and share this article so that you, the reader, might find a nugget or two for
your own business. It's not copyrighted - you can copy it to your heart's content and
pass it along to whomever you think would benefit from it.

Simple Example For Illustration Purposes

A major candy and confections company was computerizing its supply chain to make
things more efficient. All the details of the raw ingredients, time to ship, quality
testing time, etc., were being integrated into one expert system. In the process they
missed one piece of information needed for synchronizing events: seasonal supply chain events.
They practically missed Halloween! Eventually, with frenzied damage control, they were able
to gear up production to meet the demand.

So, that's an obvious example, and it's such a big snafu that it's hardly likely your
company would make that mistake: your firm is not likely to miss its "Christmas
season." But it can miss many smaller practices that can cause bigger glitches. "A stitch
in time saves nine," as they say. Problems at the headwaters of production can get
amplified downstream.

Here are some illustrations from my experience that could contain important lessons for
anyone involved in process manufacturing:

1. What you measure is as vital as how well you measure it: or... "Product
shipped by weight, not volume; contents may settle during shipping."

That consumer warning on packages reminds us of an everyday occurrence: focusing on
one parameter while measuring another. Why would you measure volume if you're
shipping weight? Well, it can get confusing.

Think of a school bus. What color is it? Yellow, right? Not exactly. It's actually a
specific mix of trace colors like red, blue, etc., which are added to the yellow in just the
right proportions to make the bus so recognizable in dim light or stormy conditions.
The recognition and safety factors, so key to quality, stem from less-than-obvious
characteristics.

Other examples abound. Let's look at other interesting formulated products.
At one time I worked with a small maker of low-tack hot-melt adhesive that was used to
make tacky polishing cloths for the furniture industry. One day the company found
product was being returned for inconsistent tack levels and poor color and appearance.
The company needed an objective, targeted measure of the quality and effectiveness of the
cloth to satisfy its large OEMs and its ISO registration requirements.

The "stickiness" characteristic of the finished product was conferred by the polymer
impregnated in the surface. Up to that point the company had relied upon a "thumb feel"
subjective judgment of function, along with a set of deteriorating retained products as
visual standards for color and general appearance.

Employees thought viscosity of the molten blend at discrete stages was the key.
Measuring that would call for some expensive, high-maintenance instrumentation used
carefully at precise intervals. But I suggested examining the process more closely with a
disciplined experiment designed to see which process variables were clearly connected to
important end-product characteristics.

As it turned out, viscosity was a weak predictor of quality and performance. In contrast,
a low-tech, inexpensive surface friction test proved to be far more effective. The
company saved 00 per device in test equipment investment, along with the daily operating
and maintenance costs connected with the more elaborate testing method.

The subsequent QC testing program that was set up incorporated this and other
straightforward measurements. These methods kept costs and quality in line and offered
deeper insight into the impact of process conditions and ingredient proportions. In
addition, the friction test provided a vital element of the objective "process control"
requirement of QS-9000 (now 2000) guidelines.
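To make that kind of disciplined comparison concrete, here is a minimal sketch in Python of checking which candidate in-process measurement best tracks the end-product characteristic. The data, units, and variable names are entirely hypothetical; this illustrates the approach, not the firm's actual analysis.

# A rough sketch of comparing candidate in-process measurements as predictors
# of an end-product characteristic. All names and data are hypothetical.

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical results from a small designed experiment (eight trial runs).
melt_viscosity   = [1240, 1195, 1280, 1225, 1310, 1205, 1230, 1290]  # cP
surface_friction = [0.42, 0.55, 0.38, 0.71, 0.63, 0.47, 0.68, 0.44]  # rig units
final_tack       = [3.1, 4.0, 2.9, 5.2, 4.7, 3.5, 5.0, 3.3]          # panel score

for name, xs in (("viscosity", melt_viscosity), ("friction", surface_friction)):
    print(f"{name:>9}: r = {pearson(xs, final_tack):+.2f}")

# The measurement showing the stronger, more consistent relationship is the
# better candidate for routine QC checks.

In the adhesive case, the low-tech surface friction reading played the role of the stronger predictor.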

This new approach enabled the company to achieve compliance, satisfy vendor certification
and supplier audits from large key customers, and qualify for new business.
Is your firm, large or small, simple or complex, measuring the right things for the right
reasons?

2. Haphazard Sequencing of Orders and Processes Causes Waste, Idle Time, and Delays

Everyone knows the aggravation of living with the limitations and constraints of
processing equipment. Only so much quantity and variety can be produced in a given
period. What is less obvious is that there is often still room to adjust the sequencing and
conditions to maximize throughput. These adjustments can even go a long way toward
fulfilling the "lean" goals of less waste and idle time. This requires at least a little
disciplined logic and spreadsheet figuring over and above what a visual scheduling tote
board, grid diagram, or other types of crib sheets can provide.

In one fluid agrichemical operation I observed, the purchasing manager, the shift
foreman, and a planner/scheduler all had their own schedules. No schedule was
compared with any other until after many production orders were already launched.
Which one prevailed? You guessed it: a fourth, in-plant, "on the fly" compromise
schedule based on hunches, incomplete information, and last-minute corrections.

Other examples can be found:

Consider two consecutive, interdependent processing stages with separate process
durations that are somewhat predictable in advance. The sequenced stages might involve
agitating, heating, curing, drying, and the like. The duration of each will vary depending
on the raw material in question, the conditions chosen, even the weather (as in conveying
and melting wax or other substances). This means that for a given job one stage may
take longer than the other, while the reverse may be true for another job.
A producer of dispensing-machine beverage powders was experiencing order stack-ups
and random waiting and idle periods as the sequence of different materials was
processed. As part of the general process overview, I learned that the firm had a two-stage
blending and spray-drying process. In this process some raw materials dissolved
better than others, while some finished blends required slower, gentler drying cycles.

In such cases a good way to sequence the jobs is to find the job with the shortest
predicted process time in either stage. That job is then queued up first if the first stage is
expected to be the shorter one. If the second stage is the faster one, the job
is queued up last.

The process is repeated with the list of jobs remaining, filling the sequence in from both
ends toward the middle until all jobs are sequenced. An optimal sequence minimizing
total process time will result.

A small modification of this policy, using improvement factors, can be used in cases
where the second stage can start before the first is complete, as with split orders, draw-offs,
and partial batch transfers.
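For readers who like to see the mechanics, here is a minimal Python sketch of the basic two-stage rule just described (essentially what is often called Johnson's rule). The job names and stage times are hypothetical and would normally come from a spreadsheet of predicted durations.

# A minimal sketch of the two-stage sequencing rule described above.
# jobs maps a job name to (stage-1 time, stage-2 time) in hours.

def sequence_two_stage(jobs):
    remaining = dict(jobs)
    front, back = [], []
    while remaining:
        # Find the job whose shortest predicted time (in either stage) is smallest.
        name = min(remaining, key=lambda j: min(remaining[j]))
        t1, t2 = remaining.pop(name)
        if t1 <= t2:
            front.append(name)    # shortest time is in stage 1: queue it early
        else:
            back.insert(0, name)  # shortest time is in stage 2: queue it late
    return front + back           # filled in from both ends toward the middle

# Hypothetical blending (stage 1) and spray-drying (stage 2) hours per job.
jobs = {"A": (3, 6), "B": (5, 2), "C": (1, 2), "D": (6, 6), "E": (7, 5)}
print(sequence_two_stage(jobs))   # -> ['C', 'A', 'D', 'E', 'B']

The same comparison can be done by hand or in a spreadsheet for a modest number of jobs; the sketch simply makes the tie-breaking and bookkeeping explicit.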

This approach does not require costly or sophisticated software, or anything more than visual
comparisons and spreadsheet-level calculations. It also proved useful in environments such as a melt-and-mold sequence for wax slab preparation, where melting and molding steps can alternate as bottlenecks, depending on the grade of wax and other conditions.

Note that the objective is not greater equipment utilization for its own sake, but better
throughput of job orders per time period--what generates earnings for the firm after
sunk costs have been committed. (No one has ever been paid directly for how
uninterrupted their equipment usage is, unless it's to settle a bet!)

Are there process sequences in a facility you know that could be more productive? There
may be a way to get that productivity without equipment upgrades or expansions.

3. Ignoring Growing Gaps Between Aging Application Software Features and Needs

Everybody's got software to sell nowadays. It's tempting to avoid the complications and
uncertainties of changing or upgrading. We have witnessed a deluge of specialized
offerings ranging from ERP, MES, and industry-specific CRM systems to process control
and data acquisition systems. And now there is enterprise manufacturing intelligence (EMI) to tie
the loose ends together. It can all induce severe "acronymphobia", not to mention
confusion and paralysis.

On the other hand, tough times require that we seize every advantage possible. How can
we do that when I/T budgets are shrinking?

One approach to consider is letting an unbiased, computerized expert system help with
the decision. These expert system logic tools can gauge the gap between past, present,
and future needs and the capabilities of the applications already in place. Best of all, time and
assistance can be rented on a per-task basis, keeping the investment modest.

At one time, I undertook an application software evaluation for a small metals
processor. The firm served the automotive and allied industries and organized its stocked
and in-process materials by attributes (size, weight, length, width, grade, etc.) rather than
by SKU numbers. The OEM firms using such an "outside processor" are essentially both
suppliers and customers at the same time--material arrives, gets processed, and is
returned to them.

Each of these circumstances limited the metal processor's ability to conform to more
widely applicable manufacturing software packages. Use of the expert system evaluation
tool, combined with some extra research, revealed that an industry-specific niche ERP
product for "attribute-based" metals processors would work better than an initially
cheaper generic ERP system that could be costly to customize, adapt, and maintain.

After selection and implementation, the estimated savings for this small firm were
well in excess of 0,000 annually when compared to the old methods of tracking,
organizing, and executing material processing, which combined manual work and older, less
adaptable software. Savings on customization costs, both planned and unplanned,
obtained by selecting the specialized rather than the generic package added further one-time
savings estimated at 0,000. Of course, no one should blame the client for
declining to run an experimental generic package implementation in parallel to verify
these savings!
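As a rough, purely hypothetical illustration of the kind of weighted gap scoring such an evaluation performs, the Python sketch below rates two candidate packages against a few attribute-based requirements. The requirements, weights, and coverage scores are invented for the example, not taken from the actual engagement.

# A hypothetical weighted-fit comparison between a niche and a generic package.
# Weights reflect business importance (1-10); scores rate coverage (0-10).

requirements = {
    "attribute-based inventory (size, grade, gauge)": 10,
    "customer-owned (toll) material tracking": 9,
    "unit conversions (weight/length/pieces)": 8,
    "low cost to customize and maintain": 7,
    "general financials and reporting": 5,
}

candidates = {
    "niche metals ERP": {
        "attribute-based inventory (size, grade, gauge)": 9,
        "customer-owned (toll) material tracking": 8,
        "unit conversions (weight/length/pieces)": 9,
        "low cost to customize and maintain": 8,
        "general financials and reporting": 6,
    },
    "generic ERP": {
        "attribute-based inventory (size, grade, gauge)": 3,
        "customer-owned (toll) material tracking": 4,
        "unit conversions (weight/length/pieces)": 4,
        "low cost to customize and maintain": 3,
        "general financials and reporting": 9,
    },
}

best_possible = 10 * sum(requirements.values())
for name, scores in candidates.items():
    fit = sum(weight * scores[req] for req, weight in requirements.items())
    print(f"{name:>16}: weighted fit {fit} of {best_possible}")

A real evaluation considers many more criteria and documents the reasoning behind the weights, but the underlying comparison is of this general kind.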

Is there a suspicion that the capability gap between business software and requirements
may be growing? Now might be the right lull in the business cycle to find out and take
action.

4. Needlessly Prolonging Continuous Improvement in Two Crucial Ways

The batch flow or semi-continuous process operation at first glance seems less amenable
to finding and implementing improvement opportunities than the textbook discrete "widget
making" or piece-part operation you may have seen. Pipes, vessels, steam jackets, or
reactor chambers obscure the otherwise identifiable avoidable waste and "non-value-add"
activity, and crews may tend to activities spanning several lines rather than
distinct work areas or cells. Yet reasonable, bite-sized, digestible improvements in these
environments are feasible. What may make more sense, at the risk of committing heresy
against lean thinking or other improvement philosophies, is to move directly to a better
"future state" rather than optimizing or dressing up the "as is" state at critical points in
the facility. (I will assume a little familiarity with process improvement vocabulary here,
but the ideas are straightforward.)

To illustrate, it may make more sense to skip the work area neatening and straightening
and "quick hit" efforts called for by workplace clean-up ("5S" in Japanese buzzword
speak) and "kaizen" incremental improvements, and go right to the new "to be"
process setup. The existing human work space is external to the process, after all, and
may even prove irrelevant after changes are made.

A hand-blended, barrel-and-drum-scale stamping lubricant blender that I assisted made
the leap to an automatic, mass-flow-meter-based blending system. This mini-scale
operation found that neatness and reduced clutter simply followed in the wake of the
upgrade once the frantic pace of manual rework and adjustments abated. It can be more
heartening to the team to make manageable, meaningful changes of this sort rather than
trivial or elementary ones, or to make pointless changes destined to be quickly displaced
by others.

A food microbiological sample testing facility client had a congested incoming sample
area, an immovable, fixed walk-in incubator, and other sample prep, weighing, and
sequential repetitive processes surrounding it. The facility needed a way to ease the
receiving area choke point and get samples processed faster to serve critical customers
better. Through our help they found that adapting continuous production flow techniques
to "pull" products downstream, eliminate unnecessary unproductive movement, and
properly manage incubator capacity and timing provided practically instant relief to the
receiving area. Business was solidified and billable volume increased, while workers got
home earlier and avoided burn-out. Time spent on sprucing up the receiving or first-stage
processing area beyond removal of safety hazards and some basic mistake-proofing
would not have yielded comparable benefits and would have postponed genuine
improvements.

Similar benefits were obtained in a small continuous pasteurized fluid beverage facility.
Here, ironically, manual clean-up gear created more clutter until better-planned
continuous runs of product reduced the frequency of changeover and disruptive clean-up
steps.

A second cause of prolonged improvement initiatives is attempting to painstakingly
streamline every identifiable sub-process node in a larger production or paperwork
process regardless of its impact on the whole. More often than not there are crucial sub-processes
that govern the operation of the whole process--whether in rate of production,
build-up of work-in-process queues, or control of quality. Zeroing in on them will help
limit the risk of creating locally optimized processes that don't improve overall
performance. In the lab case above, speeding up the completion of the final testing
steps that occurred after the incubator stage, while possible, would have only tightened a
few isolated downstream sets of procedures. Improving the batch handling of samples
through the incubator, on the other hand, made a significant impact on the overall
effectiveness and efficiency of the lab.
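A back-of-the-envelope way to spot the "significant few" is simply to compare the effective capacity of each stage and see which one caps the whole flow. The short Python sketch below does that with hypothetical stage names and rates loosely patterned on the lab example.

# Hypothetical hourly capacities for each stage of a sample-testing flow.
stages = {
    "receiving / log-in": 60,
    "prep & weighing": 45,
    "incubation (batch)": 25,   # batch steps often hide the real constraint
    "final testing": 50,
    "reporting": 80,
}

bottleneck = min(stages, key=stages.get)
print(f"Overall throughput is capped at about {stages[bottleneck]} samples/hr "
      f"by '{bottleneck}'.")
print("Speeding up any other stage first adds little to overall output.")

In the lab case, that is why better batch handling through the incubator paid off, while polishing the downstream testing steps would not have.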

Even more dramatic examples occur among plant floor operations where certain steps
involve bottlenecks or "rate limiting" steps. Finding all improvement opportunities is a
good long-term continuous improvement objective, but only a critical few may yield
success in the short run. Are there a "significant few" areas for improvement being
overlooked while the "trivial many" are under study? It may pay to take a look.

5. Missing Ways of Bridging R&D and Production Operations and Data Domains

While discrete manufacturing environments seem like naturals for integrating design and
manufacturing, what with CAD, shop floor drawings, and tightly orchestrated
engineering change orders, the situation is murkier in process manufacturing. Complex
formulations and lab-scale chemistry, biology, and materials science can leave many
loose ends on the way from R&D to production, such as separate bills of materials,
recipes, or favorite formulae, not to mention separate quality and performance
measures. Scale-up pilots that reveal unexpected cause-and-effect relationships among
variables and observed quality or performance add even more complexity.

One confectioner I worked with experimented with multiple methods of monitoring
chocolate liquor viscosity and solids concentration for the purpose of controlling and
adjusting end-product characteristics. I worked with them to establish an open-loop data
communication link between the lab and the liquor and sugar tank farm areas so that
adjustments could be made during full-scale special product runs. This approach avoided the typical delays
and lags that normally require chasing and catching already-processed product to make
changes. Eventually, the close link-up allowed R&D to fine-tune the in-process formula to
obtain a richer, smoother mouth-feel product that commanded a better price and profit
margin. Later, modifications were made to QC testing to make spot checks on the same
product. That effort brought QC and R&D into closer alignment on their methods and
priorities, bridging a gulf almost as serious as that between R&D and production.

Careful combinations of procedures and technologies can bridge functional gaps and
cultural "brick walls" that can seem insurmountable or are simply "part of the
landscape." Might there be a few applicable to your environment?

